AI 'deepfake' videos make investment scams harder to spot as Americans lose billions

Scammers remain as active as ever, and artificial intelligence and other sophisticated tools raise the risks for potential victims.
Americans were swindled out of an estimated $12.5 billion online last year, up from $10.3 billion in 2022, according to the FBI’s Internet Crime Complaint Center, but the totals could be much higher. The FBI cracked one case in which it found only 20% of the victims had reported these crimes.
Scammers continue to change their ruses and techniques, and AI is figuring more prominently in their schemes. The FBI estimates 39% of victims last year were swindled based on “deepfake” or doctored videos that were altered using AI technology to manipulate or misrepresent what someone actually did or said. The videos are being used in investment scams, in romance swindles and in other ways.
“This national crisis is likely to get worse in the years ahead as scammers can now use artificial intelligence to create deepfake videos of business leaders, celebrities, politicians and romantic suitors that are difficult to detect,” said Social Catfish, a company that helps people combat online crimes, especially those of a romantic nature, by verifying photos using reverse-image searches.
The role of AI in financial scams
Scammers can use AI technology to duplicate voices and trick people into sending money or revealing personal information by pretending to be family members, co-workers or friends, said Christian Romero, a community manager at Chase. He spoke to roughly 100 residents during an anti-fraud meeting Oct. 10 in Sun City West. Arizona ranks fifth nationally in both online complaints per capita and monetary losses per capita, according to the FBI report.
Using AI, scammers can process larger quantities of data and thus try more password combinations in an attempt to break into a victim's account, Romero added. To guard against this, everyone should use strong passwords, change them frequently and enable two-factor authentication, he said.
Americans filed more than 880,000 online fraud complaints last year with the FBI's Internet Crime Complaint Center. Social Catfish estimates that 96% of the money reported lost was never recovered, partly because most scammers live overseas. Many crooks demand payment with cryptocurrencies, which figure in most investment-related crimes. Some crypto schemes involve crooks offering to help victims recover money lost in prior crypto swindles.
The FBI classified the highest proportion of online crimes last year in the investment category, covering a broad range of topics but with the common theme of pitches touting high returns. Another prominent category involved scams where crooks infiltrate business email accounts and use the information obtained, including contact listings, to demand payment for various services.
Also common are ransomware schemes, where crooks infect victims’ computers and demand payments to unlock the computers and the information stored on them. Then there are technology/customer support crimes where crooks convince victims that their computers are infected and promise to clean them for a fee.
Ways to spot a deepfake video
Online crimes involving deepfake or doctored videos are fairly new. Here are some ways you can try to spot falsified videos or photos, according to Social Catfish:
Unnatural movements: Look for subtle irregularities in videos that involve facial movements or expressions, such as in how the eyes and mouth move. Focus on potential mismatches between the words spoken and lip movements.
Inconsistent lighting and shadows: Pay attention to lighting, as doctored videos often struggle to maintain consistency, especially if material from different sources has been combined. Unusual skin tones might be a clue.
Audio irregularities: Listen for changes in the tone or quality of a person’s voice. The sounds might seem out of sync, and the audio might seem flat or emotionless.
Try to authenticate a video by checking with the source that supposedly created it. On the romance front, potential victims could be carrying on a conversation with a person who doesn’t exist, at least in the way portrayed. Always insist on meeting in person before making any financial or other commitments; if the other person balks or claims to be living in another country, consider that a red flag.
Other scams still prevalent
Even with the advent of deepfake videos and other technologically sophisticated schemes, crooks still rely on many of the methods they have used in the past.
One scam involves crooks claiming to know about an arrest warrant for a potential victim and requesting, say, $500 in gift cards to resolve it, said Capt. Brian Stutsman of the Maricopa County Sheriff's Office, speaking at the Sun City West anti-fraud event. Another is the "grandparent" scam where potential victims are asked for bail money to free a grandchild supposedly being held in a Mexican jail.
“Don’t give anything to anyone over the phone,” Stutsman advised. None of the kidnapping scams that the Sheriff's Office investigated turned out to be authentic, he added, yet some victims were tricked into handing over money. Seniors in particular tend to be trustworthy and polite when solicitors contact them over the phone, yet “it's OK to be rude and hang up,” he said.
Lance Hunzeker, a financial crimes deputy in the Maricopa County Sheriff's Office, emphasized that it's mostly incoming calls, text messages and emails that people need to be vigilant about.
"Anybody can be a victim of scams and fraud," said Romero. "I’ve been a victim of fraud and I’ve been scammed, and I’m a banker (and) know how to protect myself."
Rather than feeling shamed or embarrassed, he said, it's important for victims to turn to bankers and others who have training and resources to help. It's also important to report crimes to law enforcement authorities and, above all, to remain skeptical.
Reach the writer at russ.wiles@arizonarepublic.com.