'Living a lifelong sentence': How AI is trapping women in a deepfake porn hell
Noelle Martin was just 18 when she discovered that pornographic pictures of her were being circulated online. She had no memory of taking, let alone sharing, intimate images. Yet the face in those images was unmistakably hers - the body was not.
She had become a victim of what would later be known as "deepfakes": pornographic pictures manipulated to look like her, built from photos she had shared on her personal social media accounts.
"This is a lifelong sentence," Martin told Euronews Next. "It can destroy people's lives, livelihoods, employability, interpersonal relationships, romantic relationships. And there is very, very little that can be done once someone is targeted".
Deepfakes are digitally altered videos or images created to depict someone in fabricated scenarios. While deepfake technology can in theory be used for more lighthearted, satirical or well-intentioned purposes, a 2019 Deeptrace Labs report found that 96 per cent of deepfake content online is non-consensual pornography.
Just this week, a 22-year-old man in New York was sentenced to six months in jail for posting deepfake porn photos of former school classmates using teenage pictures of them taken from their social media accounts.
Back in 2013, when Martin first discovered the images, she tried to limit the damage by going to the police and asking for them to be taken down. But there was nothing they could do: at the time, Australia, where she lives, had no laws against the dissemination of intimate images.
"Even if you can try and take things down, if you're a victim of this, you’ve still got issues to do with holding whoever is responsible to account. Because taking things down from public sites, from these websites or from wherever