Jennifer Aniston used in deepfake scam

A British man was scammed by a deepfake of Jennifer Aniston in a romance scheme. Learn how AI and social engineering power modern phishing attacks.

03-07-2025 - 3 minute read. Posted in: cybercrime.

Scammers use deepfake Jennifer Aniston in AI romance scam

A British man has fallen victim to a cybercriminal scheme involving deepfake technology and emotional manipulation. In a disturbing case that reflects the evolving landscape of cyber threats, fraudsters used artificial intelligence to impersonate actress Jennifer Aniston and lure the victim into a fake romantic relationship.

AI-generated celebrities and emotional manipulation

Over a five-month period, 43-year-old Paul Davis from Southampton was targeted by scammers who created AI-generated videos and audio of several public figures, including Jennifer Aniston, Elon Musk and Mark Zuckerberg. The fake Aniston regularly sent video messages and voice notes, calling him "my love" and expressing affection in a tone that mimicked the actress's real voice.

The scammers even produced a fake driver’s license as supposed proof of identity. The combination of realistic visuals, consistent communication and affectionate language made the scam feel authentic and convincing.

The financial hook

Eventually, the fake Aniston asked Davis for help paying her Apple subscriptions, requesting £200 in Apple gift cards, which the victim sent. While the financial loss was relatively small, the scam had a significant emotional impact.

This type of scam is part of a growing trend known as romance phishing, where cybercriminals build emotional relationships with victims to exploit them financially. By adding deepfake technology to the mix, the attackers increase their chances of success.

A dangerous combination: social engineering and AI

This case highlights the power of combining advanced AI tools with classic social engineering techniques. Deepfake technology allows scammers to impersonate real people with alarming accuracy. Public figures are especially vulnerable since there is an abundance of video and audio content online that can be used to train AI models.

In this case, Jennifer Aniston had no involvement. Her digital likeness was used without her knowledge or consent, illustrating the ethical risks and privacy concerns associated with generative AI.

Learn more about how deepfakes can impact businesses and individuals in our full guide, and explore how attackers exploit human psychology in our article on social engineering.

Prevention and protection

The rise of AI-driven scams makes it harder to detect fraud using traditional cues like spelling mistakes or strange formatting. Here are a few important tips to help protect yourself:

  • Be cautious of anyone expressing strong emotions early in a relationship, especially online.

  • Verify identities using official or trusted sources before engaging further.

  • Never send gift cards or money to someone you have not met in person.

  • Look for inconsistencies in facial expressions, lighting or voice patterns that might suggest a video is AI-generated.

  • Report suspicious behavior to the appropriate platforms or local authorities.

Looking ahead

This case is not an isolated incident. As AI tools become more accessible, scammers are using them to create increasingly believable deceptions. Cybercrime is entering a new phase where synthetic media and emotional manipulation are used together to exploit victims more effectively.

At Moxso, we believe awareness is the strongest defense. Understanding how these scams work and staying vigilant online is key to preventing future attacks. Cybersecurity is no longer just about protecting data – it is about protecting trust.

Author: Sarah Krarup

Sarah studies innovation and entrepreneurship with a deep interest in IT and how cybersecurity impacts businesses and individuals. She has extensive experience in copywriting and is dedicated to making cybersecurity information accessible and engaging for everyone.
