An 82-year-old man was tricked out of $17,000 after scammers used artificial intelligence technology to pose as a Texas police officer over the phone, ABC 13's Erica Simon reported.
Jerry, identified only by his first name, told ABC 13 that he received a call on Oct. 21 from a stranger who introduced himself as Sgt. Matthew with the San Antonio Police Department. The caller claimed that Jerry's son-in-law, Michael Trueblood, was in jail for causing a serious accident. A second voice, which Jerry believed at the time was his son-in-law's, then came on the line and asked him to help pay for his release.
The scammers convinced Jerry to withdraw a total of $17,000 and hire a courier to pick up the cash from the assisted living facility where he resides with his wife. Although Jerry wrote down the license plate numbers of the vehicles that drove off with his money, officers with the Sugar Land Police Department said leads on any connected courier services have come up empty.
"We really depended on that money. I'm going to have to get a job somewhere. H-E-B or something like that, to try to restore some of this money," Jerry said.
AI experts say there has been an uptick over the past year in similar schemes that rely on voice-cloning tools. According to a FOX Business article by Eric Revell, advances in the technology have made it easier for criminals to impersonate their targets' friends and family members.
With audio samples as short as three to 10 seconds, scammers can realistically replicate a person's voice. These short clips can be pulled from phone calls, social media, or other corners of the internet, Mike Scheumack, chief marketing and innovation officer at the identity protection company IDIQ, told Fox. "The goal of the scammer is to get you into fight or flight [mode] and to create urgency in your mind that your loved one is in some sort of trouble," he said. "So the best way to deal with those situations is to hang up and immediately call your loved one to verify if it’s them or not."
These schemes also generally require multiple people to carry out, Scheumack added. "You have people that are researching on social media and gathering data about people. Those are not the same people that are going to plug in your voice. You have somebody else that’s going to clone the voice. You have somebody else that’s going to actually commit the act of calling. And you have somebody come to the victim’s house and pick up money if the scam is working."
As these operations proliferate, President Joe Biden has signed an executive order calling for strict standards to manage AI safety and security. On top of directing software developers to share safety testing results and other critical information, the order urges government agencies to set guidelines for improving cybersecurity and privacy, protecting U.S. citizens, and authenticating digital content.
"AI is all around us," Biden said, per an Associated Press report. "To realize the promise of AI and avoid the risk, we need to govern this technology."