Criminals are altering images of people obtained from social media or other public sites to create fake “proof of life” photos as part of virtual kidnapping for ransom scams, the FBI said in a public service announcement on Dec. 5.
According to the agency, criminal actors typically get in touch with their targets via text messages, claiming to have kidnapped people close to them and demanding ransom payments. The demands are often accompanied by threats of violence.
The photo or video will, upon close inspection, reveal inaccuracies, with examples including “missing tattoos or scars and inaccurate body proportions,” the FBI said. The messages convey a sense of urgency and are sent using timed message features so that family members do not have sufficient time to analyze the details.
Instead of reacting hastily, people who receive such communication should stop and consider whether the kidnapper’s claims “make sense,” the notice said.
The agency advised people to always attempt to contact their loved ones before considering paying the ransom. A code word, known only within their close circle, can be crucial.
Moreover, scammers running this type of fake-image ransom scheme will use missing-person information found online. The agency advised people to immediately take a screenshot of, or otherwise record, any “proof of life” photos they receive.
AI-Powered Kidnapping
In a June 16 statement submitted to a House committee hearing, JB Branch, a technology accountability advocate at the nonprofit Public Citizen, highlighted the risks posed by artificial intelligence (AI) tools.
“From phishing emails generated in perfect English to deepfake videos impersonating family members, AI tools are being weaponized in ways that exploit trust, erode safety, and overwhelm law enforcement,” Branch said.
“In the era of AI, a single tool can now produce thousands of personalized phishing attacks or clone a victim’s voice in seconds. This proliferation has quickly outpaced previous technology, and the models can adapt quickly.”
Branch said local police forces have issued warnings about a rise in virtual kidnapping scams that make use of cloned voices. He highlighted a case involving a mother from Arizona who was nearly duped by such a scam.