Rise of the post-truth sex tape: Deepfake pornography is making women’s online lives even more frightening

The vast majority of deepfakes created and circulated online had no political motive. Most of them were, and still are, porn (iStock)

A man sits close to a microphone stand. Tears fill his eyes. He puts his head in his hands. “It’s so embarrassing,” he says. “It’s gross and I’m sorry.” Behind him, his wife wipes away tears with a crumpled tissue.

A few days earlier, Brandon Ewing – known as “Atrioc” to his followers on the streaming platform Twitch – had shared his screen during a live video. One of the visible open tabs showed a paid-access porn site specialising in AI-generated sexual images, otherwise known as “deepfakes”. Viewers of the stream screenshotted the leak, then widely shared the site, images from it, and the names of the women who were deepfaked. Ewing had accidentally revealed he had been looking at deepfake pornography of two other popular Twitch streamers, Maya Higa and Pokimane. Neither had consented to their likenesses being used for sexually explicit material.

“I got morbidly curious and I clicked something,” Ewing said in his apology video. He insisted he’d navigated to a deepfake site from an advert on Pornhub, which took him to another subscriber-only website. There he paid to view the doctored images of the female streamers. “There’s an ad on every f***ing video, so I know other people must be clicking it,” he said. He also denied that his viewing of non-consensual porn was a “pattern of behaviour”. Yet despite clearly attempting to write off the incident as a dumb rookie mistake that would be easy for anyone to make, he also declared his behaviour “embarrassing” and “disgusting”.

It has been more than five years since users of the social media site Reddit started using AI to perform face swaps – digitally stitching one person’s head to another’s body, making their likeness move and even speak in ways the real individual never has. Back in 2018, the media reaction to these new “deepfakes” centred on their potential for political hoaxes. The dominant concern was that this rapidly advancing technology – which had the power to literally put words in the mouths of any public figure on the planet – would plunge us ever deeper into a post-truth existence, and a new era of AI-generated spin. Yet right from the start, the vast majority of deepfakes created and circulated online had no political motive. Most of them were, and still are, porn. Indeed, the term “deepfakes” itself comes from one Reddit user who went by that moniker in a celebrity doppelganger porn forum, where he posted AI-generated mash-ups of celebrity faces on porn performer bodies. The result: realistic-looking videos of people having sex that never actually happened.

Today, what began as one man’s pervy hobby has taken over the internet. According to government statistics, a website that virtually undresses photos of women received 38 million hits in the first eight months of 2021. As Ewing himself put it, “there’s an ad on every f***ing video” on Pornhub. It’s not only celebrities who are targets, either, or even widely recognised streamers in the Twitch community. A vast proportion of the deepfake porn marketplace – where people pay for AI-generated sexual imagery of specific individuals – involves regular people. Colleagues. Friends. A girl someone follows on Instagram. Someone’s family member. With just a few clicks, it’s possible to make, share and profit from pornographic videos starring non-consenting people, using only a single front-facing photo. While the motivation behind making, commissioning or consuming deepfake pornography may be sexual gratification, a laugh, or a twisted desire for control and power over women’s bodies, often the imagery is also used to intimidate and harass the women whose likenesses have been appropriated. Essentially, deepfakes are post-truth sex tapes – artificial intimate images weaponised to tarnish women’s reputations, silence them, or extort money. They might be “fake”, but they cause very real harm.

In the wake of last week’s Twitch scandal, a number of content creators have spoken out – many of whom didn’t know they’d also been featured on deepfake porn sites. Pokimane, one of the streamers shown on Ewing’s deepfake porn tab, tweeted “stop sexualising people without their consent”. Another streamer known as BrookeAB asked “how long are we going to accept that being stalked, harassed, sexualised ... is just ‘part of being a woman on the internet’ and ‘what we signed up for’? When are we going to take the steps to change this? Why is this normalised?” In a now widely shared video, streamer QTCinderella – the partner of Ewing’s friend and business associate Ludwig Ahgren – similarly argued that she shouldn’t have to fight and pay to get deepfake content taken offline. “It should not be part of my job to be harassed, to see pictures of me ‘nude’ spread around,” she insisted through tears. The fact that it is, she added, “is exhausting”.

QTCinderella also spoke directly to people currently consuming deepfake porn. “If you are able to look at women that are not selling themselves or benefitting off being seen sexually ... [or] platforming it themselves ... you are the problem,” she said. “You see women as objects. You should not be OK doing that.”

Yet, rather than acknowledging these women’s pain at having their images manipulated and distributed, or recognising the central issue of consent, many responses displayed a disturbing lack of empathy. Many more seemed not to understand the realities of sexual harassment. Replies to Pokimane often seemed to accuse her of hypocrisy, with one asking “isn’t sexualising yourself why you are popular?” Another demanded she “stop telling people how to live their lives”, and argued “all of your success is from being sexualised”. One Twitter user posted a screenshot from QTCinderella’s video of her crying, and wrote “millionaire internet streamer’s reaction to AI porn of herself. You won’t find more fragile people than popular internet personalities (especially women)”. Perhaps even more concerningly, comments on Ewing’s apology video seemed to dismiss the whole thing with a “boys will be boys” attitude. “Me watching streamers get cancelled for things I do every day”, one wrote, while others responded “he just like me”, and “one of us one of us one of us”.

The misogyny of these responses is crystal clear. Women are fair game, they say, because they are rich, popular, pretty, or simply online. If they get upset, it’s because they are “fragile”, weak, or overly sensitive. The implication is that they are “taking it too personally”, despite the fact that what is being sold and traded is, literally, their personhood. Non-consensual taking and sharing of intimate images is abuse. Yet it seems that many people still refuse to see deepfake porn as “real” abuse with real-world consequences.

Part of the problem is that laws currently lag behind both technology and reality. Often the creators and distributors of deepfake porn can only be pursued if their sharing of doctored images goes hand in hand with blackmail or other forms of harassment. This can leave those who are targeted with little choice but to attempt to have content taken down themselves.

There are signs, however, that deepfake porn is finally beginning to be taken seriously by lawmakers. Last autumn, the British government announced that it would be criminalising non-consensual deepfake pornography, as part of the controversial Online Safety Bill, which returned to parliament in December after years of delays. There are still a number of potential stumbling blocks, though. When the bill returned to parliament, journalist Brit Dawson wrote for Dazed that “delays could mean the laws won’t even be fit for purpose when or if they’re eventually enacted”. As Dawson also indicated then, “intimate-image abuse legislation has previously fallen victim to loopholes, including focusing on ‘intent to cause distress or humiliation’, rather than consent, which has led to a lack of prosecutions.”

Twitch streamer QTCinderella breaks down in tears as she discusses her image being used in deepfake porn (Twitch/QTCinderella)

Two months on, a handful of politicians and peers are voicing exactly this concern. Outlining a plan to put forward an amendment to the bill, a group of peers suggested that the government’s plans for new offences such as cyberstalking and the sharing of intimate images do not go far enough and will fail to stop online misogynistic abuse. Earlier this month, shadow culture secretary Lucy Powell similarly told The Daily Telegraph that the bill had been “severely weakened” by the Conservative government’s decision to ditch plans to outlaw online material that is judged as “legal but harmful” – such as content involving self-harm or eating disorders, and misogynistic posts. Powell said this change to the bill left “viral misogyny free to proliferate”.

Deepfake porn can often feel like an intractable problem, as technology advances far quicker than legislation can be pushed through. Yet as laws attempt to play catch-up, it’s even more important that culture more broadly undergoes a shift. Because, really, this isn’t about AI or technology at all but about consent and how women’s bodies are thought of as fair game.

At around the same time Ewing was releasing his apology video, the Pamela Anderson documentary Pamela: A Love Story was released on Netflix. In it, Anderson discussed the 1995 sex tape of her and her then-husband Tommy Lee, which was stolen from the pair’s home, edited and sold online against their will. She spoke of the devastating toll it took on her, and how it stripped away the sexuality she’d just reclaimed for herself after being sexually abused as a child. Anderson was made the butt of jokes: a caricature and a punchline. Now, decades later, perhaps we’d like to think that culture has moved on – that it’s less sexist and cruel, and that women are treated better. Unfortunately, the prevalence of deepfake porn, and the casual attitude so many people seem to have towards it, suggests we haven’t learnt much at all.