News

Florida Woman Loses $15K in AI Voice Scam Mimicking Daughter

In a disturbing case of AI voice cloning fraud, a Florida woman was tricked into handing over $15,000 after scammers used artificial intelligence to mimic her daughter's voice. Sharon Brightwell received a call from someone who sounded exactly like her daughter, crying and claiming she had caused a car accident that injured a pregnant woman. The caller said her phone had been taken by police and that she was being detained.

Moments later, a man posing as an attorney took over the call, telling Sharon her daughter needed $15,000 for bail. He advised her not to disclose the reason for the large withdrawal, claiming it could affect her daughter’s credit. Following the instructions, Sharon placed the money in a box, which was collected by a courier. Later, another caller demanded $30,000 more to settle with the victim’s family, claiming the unborn child had died.

Fortunately, Sharon’s grandson called her daughter, who was safe at work, and the scam was revealed. But by then, the $15,000 was gone. “That was our retirement savings,” Sharon said.

AI-generated voice scams are becoming more convincing and accessible. Criminals often use voice clips from social media to create replicas, and emotional manipulation makes these scams harder to detect.

How to protect yourself:

  • Don’t answer calls from unknown numbers.

  • Limit posting audio or video online.

  • Create and use a private family password.

  • Confirm emergencies directly using known contact numbers.

  • Involve a trusted person before taking action.

  • Report incidents to local authorities or consumer protection agencies.

Staying cautious and verifying claims through alternate channels is the best defense against such AI-enabled scams.