Disturbing AI trend with female photos: 'Delete it now'

A leading Aussie tech expert has warned it's already too late to protect ourselves.

As calls mount for legislative change to combat deep-fake AI pornography, an Australian technology expert has warned the "scary" trend is "pretty much unstoppable" — and as long as anyone has even a minor online presence, they're at risk.

This month, a report released by US analysis company Graphika found that in September, a jaw-dropping 24 million people worldwide visited so-called "undressing" websites — platforms where users can log on and de-clothe individuals using artificial intelligence (AI).

Graphika reported that now, for the first time, the creation and distribution of synthetic "non-consensual intimate imagery", known as NCII, had evolved from niche pockets of the internet into fully-scaled online businesses "that leverage a myriad of resources to monetise and market its services".

Nude deep-fakes skyrocket in popularity, report reveals

An image posted to X advertising the services of a synthetic NCII provider. The image suggests the NCII provider is marketing their services to users as a tool for harassment. Source: Graphika.

"Creators of synthetic NCII, also known as 'undressing' images, manipulate existing photos and video footage of real individuals to make them appear nude without their consent," the report stated.

A total of 34 NCII providers were uncovered by Graphika.

Tens of millions access 'undressing' pics every month

It's an issue that disproportionately affects women. It has also raised questions about the creation and distribution of child abuse material, with so many children nowadays either having their own social media profiles or being pictured on the accounts of loved ones.

If you're on social media, it's already too late

Speaking to Yahoo News Australia, Dr Dana McKay, Senior Lecturer of Innovative Interactive Technologies at RMIT University, said realistically there's little people can do to protect themselves from becoming victims.

"Essentially, the average Facebook profile photo is probably enough to create reasonably plausible non-consensual images in this way," McKay told Yahoo.

"We know there's a lot of children using social media, too. By the time they're four, most kids have — there's some ridiculously large number — well over 100 images of them online.

The volume of comments and posts on Reddit (orange) and X (blue) containing referral links to the websites of 72 synthetic NCII providers between January - September 2023. Source: Meltwater.

"So these are images that those children aren't even thinking about."

While AI porn has been growing in popularity for some time to cater to similarly soaring demand, Graphika's study also revealed it's now generating serious income, with prices for "undressing" images ranging from A$3 to A$450.

'People will continue to pay' for artificial nudes, expert warns

McKay said that for as long as there's demand, the nudes will continue to be made — and paid for.

"People have paid for explicit images since they were in magazines," she noted. "This is not new, and the fact that people can have a super customised specific image to their tastes [means] they'll pay for it.

"Especially if the services are doing — I don't want to say a good job — but a job that's effective. I think the price will probably come down though, [as] companies start undercutting each other."

According to Graphika's report, NCII is rampant on social media platforms, in particular encrypted messaging services like Telegram. McKay warns the images can be produced by pretty much anyone with relative ease, and distributed just as simply.

A screenshot of the homepage of a synthetic NCII provider website and an account bio of a synthetic NCII provider on Instagram. Source: Graphika.

Women disproportionately affected

McKay warned that the technology poses a "particular risk" for women. "While it's possible to make these images of men as well, the reason it's more problematic for women is that the social consequences of having an image shared of you, as a woman, are much more severe."

"Simple economics dictates that men have more disposable income. So if you have to pay for it, more men are probably [going to], because the social consequences of having these types of images shared if you're a woman are way more severe."

Asked what individuals can do to avoid ending up in the deep-fake world, McKay said it's far too late for that. "Nothing obvious springs to mind. I mean, wear baggy clothes in pictures on the internet, because it makes it harder to make a realistic image of us that way," she said. "But, not really [anything] — apart from campaigning for legislative change.

"And I do think we need to be doing more of that. I think we need to be making sure technology generally does not disproportionately benefit one group versus another, whether that's women, whether that's white people, whether that's straight people.

"We should be trying to ensure that the benefits of technology are shared more equitably than they are."

AI is 'in flux'

McKay said that AI is here to stay and "we just have to deal with it" and manage it properly and ethically. "We're in flux at the moment. There's always a disruptive period when new technologies take a revolutionary perspective, while the social norms are figured out around what the technology is going to be used for."

Payment options for a synthetic NCII service on Telegram. Source: Graphika.

Apart from the obvious harmful effects of deep-fake nudes, there's also a whole host of other problematic possibilities that have yet to emerge, including the impact on the adult entertainment industry, which McKay said "could definitely be threatened".

"Whatever a person's perspective is on this, it's definitely going to be a disruptive technology for the industry," she said.

According to Graphika, the volume of referral link spam — the posting of out-of-context links on websites — for NCII services has increased by more than 2,000 per cent on platforms including Reddit and X, formerly known as Twitter, since the beginning of 2023.

In addition, a set of 52 Telegram groups used to access NCII services contained at least one million users as of September this year, Graphika discovered.
