A PRO surfer has revealed she’s in “emotional hell” after blackmailers sent her AI-generated nude images of herself.
Mariana Rocha, 26, from Portugal, received a text from a strange number threatening to send the convincing fakes to her family.
Pro surfer Mariana Rocha said the worst thing is how realistic the fake nudes appear. Credit: Jam Press/@marianarochaassis
Mariana received a string of threats from the deepfake scammer. Credit: Jam Press/@marianarochaassis
Mariana said she was asked to send £4,290 (€5,000) – but she refused.
She has since come forward to say: “I would like to share my story because I believe there are more victims like me suffering from this new type of harassment.
“Today I saw the dark side of this scary world.
“I’m going to have photos of myself circulating online.
“And the scariest thing is that it’s happening to a lot of people.”
She revealed that the blackmailer proceeded to threaten her, saying that the images would be sent to her family, friends, sponsors and clients.
Even when the pro surfer said she did not have the money, the scammer continued to harass and abuse her.
She has shared a screenshot of the abuse she received alongside her message about deepfakes.
Deepfake videos are made using a blend of artificial intelligence and computer imagery to create a manipulated version of a real person.
The technology can create convincing but fictional photos or videos from scratch.
Voice clones can also be dubbed into the video for added authenticity.
The fake nudes have taken a toll on the Portuguese surfer, who described her ordeal as emotional hell.
But she has bravely spoken out in the hope that it gives others confidence in navigating these new forms of online abuse and harassment.
She added: “So watch out because these f***ing hackers are destroying lives, destroying dreams.
“As much as I believe that Artificial Intelligence has its good points, it’s frightening where we’re heading.”
She also pleaded for world leaders to “do something” about AI, as she believes it is “heading for a very, very scary place”.
This comes after a tech expert warned about how deepfakes could scam victims out of their money.
The World Economic Forum (WEF) has estimated that deepfake videos are increasing at an annual rate of 900 per cent.
Amit Gupta, Vice President of Product Management, Research, and Engineering at cybersecurity company Pindrop, told The Sun that deepfakes are an exponentially growing point of concern.
This is largely because people can currently identify a deepfake with only 57 per cent accuracy.
But Gupta explains that the chances of falling victim to a deepfake crime depend on multiple factors.
“Individuals who are less aware of the existence and sheer capabilities of deepfake technology might be more likely to fall for a deepfake crime,” he said.
And another factor that plays into the likelihood of falling victim to deepfakes is exposure risk.
For example, individuals in high-profile positions, such as politicians or celebrities, tend to “face a higher risk due to the greater impact of their manipulated public image.”
However, as deepfake technology becomes more accessible, we will likely see more deepfake-related crimes, experts say.
Gupta estimates that we will see an influx of highly realistic videos, audio recordings, and images of politicians and presidential candidates in the next 12 months.
He added that this is raising concerns about the potential to sway public opinion and convince people that events, statements, or actions occurred when they never actually did.
The scammer threatened to send the fake nudes to Mariana’s family, friends, sponsors and clients. Credit: Jam Press/@marianarochaassis