Linda Roan was making herself a bowl of soup on a Monday evening in February when her cellphone rang. The call was from a local number, so she answered.
"Mom, I'm OK, but something awful has happened," a young woman sobbed. "I need your help."
Roan immediately recognized the voice: It sounded like the youngest of her three adult daughters.
Then a man came on. He mentioned Roan's daughter by name and asked if this was her mother. The man said her daughter had witnessed his drug deal. She had screamed at the sight of guns, scaring away the buyers. Now he was stuck with a lot of cocaine---and Roan's 26-year-old daughter. He said she was in his van.
"I got in this calm state," recalls Roan, 62, though she was petrified. "It was weird."
When the man said he was out a lot of money and needed her to make things right, it crossed her mind this could be a scam. But she had heard her daughter's voice and her distinctive cry. Hadn't she?
A nerve-racking drive
The man on the phone told Roan if she ever wanted to see her daughter again, she'd have to wire him money.
He said to go to a Walmart near her suburban Denver home and send money to Mexico via Western Union. He had prepared a story for her in case the clerk got suspicious. She was to say her brother-in-law lived in Mexico, had become very ill with Covid, and needed oxygen right away.
The Federal Trade Commission in March identified impostor scams---in which someone impersonates a loved one, colleague or government official---as the most-reported type of fraud last year, resulting in losses of nearly $3 billion.
Criminals increasingly use generative AI to mimic a loved one's voice, making these kinds of scams more believable, the Federal Bureau of Investigation has warned. It takes just three seconds of audio to clone a voice with 85% accuracy, according to the security-software firm McAfee, whose survey of 7,000 people globally found that more than half regularly share voice content online.
Criminals can also use AI to approximate the voice of someone of any age, gender or dialect. In a high-stress situation, the generic voice of a young woman can be mistaken for the voice of a daughter, according to cybersecurity experts.
As Roan headed to Walmart, the man advised her to drive carefully and obey the speed limit. He told her to put her phone on speaker and conceal it. He'd be listening the whole time, and if she went off script, he'd know. When she arrived, she secured her phone inside her shirt and went inside. But she was told she couldn't wire money using a credit card---only a debit card, which she didn't have.
The man on the phone instructed her to drive home and do a Western Union transfer online. He had timed her drive to Walmart and gave her 16 minutes, he said. If she stopped, he'd know.
In an apparent effort to keep her calm, the man chatted her up, asking what she liked to do for fun. She tried to be friendly in return, thinking it might help her daughter. But when she asked about him, he got mad. "You don't ask the questions," he yelled. "I'm the one that asks the questions."
When she got home, the man told her how to initiate a wire transfer online, but to complete it, she had to call a number. She told the agent the story about her brother-in-law with Covid and successfully wired $1,000 to Mexico.
A second shakedown
Fear itself can suspend our judgment in these situations, convincing us that what we hear is real, says Mary Poffenroth, a biopsychologist and lecturer at San Jose State University.
The amygdala---which regulates emotions and triggers our brain's fight-or-flight response---gets fired up. The prefrontal cortex, the reasoning and impulse control center, takes a back seat. "There can only be one bus driver," Poffenroth says. "That's why we don't make good decisions when we're scared."
After Roan wired the money, the man said he'd release her daughter. Then, following an audible commotion, the man got back on and said his boss was mad. It took so long to get the money, they now wanted more.
He said his boss could sell her daughter for $30,000. Roan heard what she thought was her daughter crying and screaming, "No, no, please let me go!" The man barked at her to be quiet.
Roan says she pleaded with him to let her talk to her daughter. He refused. Roan still wondered if the whole thing was a scam but...that voice. "It's not that it sounded almost like her," she says. "It sounded just like her."
He told Roan they could finally end things if she wired another $1,000 through MoneyGram, switching services to avoid suspicion. After the money went through, the man said he was letting Roan's daughter go and hung up.
Roan immediately dialed her daughter, but she didn't pick up. She tried again. And again. On the fourth try, her daughter answered. "Mom, what is going on?" She was in her apartment, safe and unaware of what Roan had been through. Roan sobbed with relief.
In the McAfee survey, one in 10 respondents said they had received similar calls, and most of them lost money as a result.
After she calmed down, Roan called the police. An officer came to her house and reviewed her transactions. He searched a database for the mystery caller's phone number. Nothing came up. Same for the names of the money-transfer recipients.
According to the police report, Roan was told that prosecution was unlikely because gathering sufficient evidence would be difficult once the money reached Mexico. The officer advised Roan to dispute the charges, but the credit-card company's agent said she couldn't get her money back because she had authorized the transactions. Her bank told The Wall Street Journal this is standard practice.
Roan says she feels deeply embarrassed that she was fooled. But when she described the ordeal to friends, they were sympathetic.
"Every mother I've talked to said they would have done the same thing," Roan says.