AUSTIN, Texas — In a world where connections are just a video call away, it’s easy to trust the face on your screen. Whether dating, networking, or just catching up, FaceTime and social media video calls help bridge the distance. But what if that face is a lie?
With the help of artificial intelligence, scammers can make impersonation scams look more realistic than ever, according to the Better Business Bureau.
Cherelle Kozak is an aspiring young artist and a director here at CBS Austin News. A scammer specifically targeted her for one of these scams.
“I got a text at about, it was I think about three, almost four, and it said, hey, I really like your music,” Kozak said, remembering when this scammer first reached out.
“You're dope, do you know who this is?” she recalled. “I was like, no, I don't know who this is, and they were like, it's Fat Joe.”
The person on the other end was pretending to be chart-topping rapper Fat Joe.
“I get on the phone with him and it's actually him in the studio saying yo yo yo, I'm in the studio,” Kozak said.
This “Fat Joe” scammer told Kozak he wanted to submit one of her songs to a radio station, and if that song did well, he would sign her onto his platform. Kozak believed in this moment that this could be her big break. She was sent a link to a portal, where she uploaded her songs.
Immediately after, she said she got a message asking for money to play her songs on the radio, and that is when she started putting the pieces together. A quick Google search and a social media post from Fat Joe himself alerting his fans about the scam going around confirmed her gut feeling.
Jason Meza, senior director of media relations for the Better Business Bureau, said he sees impersonation scam reports all the time.
“We’ve seen so many examples of spoofed impersonation scams involving CBP (U.S. Customs and Border Protection), Border Patrol, law enforcement, politicians, religious leaders, and yes, celebrities.”
Meza agrees AI is making these types of scams much harder to spot.
“They can mimic your voice and your lips and make it look like you're seemingly saying something when in actuality you're not,” he said. “They look real. I might not even be real at this point.”
According to Meza, Texans lost more than $2 million to impersonation scams in 2024.
So, how are scammers able to make the video and audio on these calls look so realistic? According to Meza, it takes just a three-second clip of anyone online. That clip is run through software and mapped onto the scammer’s face.
“We’re telling people just be leery of oversharing, resist the urge to act immediately,” said Meza. “Double, triple check the people that you’re speaking with virtually and ensure that you’re reporting your experiences.”
Meza says to file a report with the BBB if you have lost money to one of these scams. It might help them prosecute the bad actors involved.
“We're putting together a massive puzzle, and the more names we have, the more experiences,” Meza said. “You might be the missing link, so don't feel ashamed to come forward.”
As for Kozak, she luckily did not give the scammer any money, but she does have a message for the person behind the artificial intelligence:
“You're just shortening your blessings and everything, taking that moment because people are really passionate and put their hardworking money into different things. Playing with people is not cool.”
This is a good reminder to us all to always trust our gut and put in the extra work to ensure we are talking to the person we think we are.
It could save you time, money, and frustration.