New laws for Aussies caught in 'shocking' online nude trend

The attorney general introduced laws to parliament this week that criminalise sharing deep-fake pornographic images, with state leaders urged to follow suit.

Promotional Undressing AI images shown revealing women's bodies.
Australians caught creating or distributing deep-fake nude imagery will now face a prison term if convicted. Source: UndressingAI

New laws that'll see offenders jailed if convicted of sharing or creating deep-fake nudes are a major "step in the right direction", one of the country's leading advocates against violence towards women said, but it's "shocking" it's taken the government this long to take strict action.

Attorney General Mark Dreyfus introduced laws to parliament today criminalising the sharing of artificially created pornographic images, or genuine adult images distributed without consent, with offenders now facing a prison sentence of six years. Those found to have created the images themselves could be jailed for seven.

Before today, only Victoria had such laws in place to combat what's now a nationwide criminal offence.

Dr Asher Flynn, Chief Investigator at the Australian Research Council Centre for the Elimination of Violence Against Women — the first centre of its kind in Australia — told Yahoo News Australia this new legislation will send a strong message to offenders, and she called on state leaders to follow the federal government's lead.

An image posted to X advertising the services of an AI 'nudify' app. Source: Graphika.

"So, the non-consensual creation of images, or deep-fakes, can be anything from images, videos, screenshots, or different types of recordings that have been created using AI tools, or even things like Photoshop, to depict someone in a nude or sexual way," she said.

"What the federal government are doing by introducing this type of legislation is really sending a message to the states and territories, that they also need to be implementing similar types of laws."

Flynn said it'll be up to the "federal jurisdiction to do the prosecuting and policing", so that's why there is "a real need for the states to look at updating their laws, too".

Attorney General Mark Dreyfus introduced laws to parliament on Wednesday criminalising the sharing of artificially created pornographic images, or genuine adult images distributed without consent. Source: Getty

Virtually anybody with an online presence could fall victim to deep-fake porn, she warned, which evidence shows disproportionately affects women. It's important that as a nation, we address the root of the issue rather than encouraging people to limit their social media use out of fear, Flynn added.

"Most of the research that's been done in this space to date has found that the majority of sexualised deep-fake content is of women," she said. "I think it's around 90 per cent, according to data from Sensity AI, who've been monitoring online videos since 2018."

"Many of the 'nudify' apps that have been created only work on images of women, but having said that, we've also done research [that revealed] other groups that were most impacted were within our marginalised communities," she continued.

"So for example, members of the LGBTQI+ community were experiencing higher rates than people who didn't identify as a member of that community. Also young adults, Indigenous Australians and people with a disability."

An AI 'undressing platform' online, showing a woman clothed and unclothed, as new laws are ushered in banning non-consensual AI pornography.
Undressing apps are rampant online, with virtually anybody a potential target, experts say. Source: Undressing AI

Flynn theorised this may be in part because "online spaces have become quite an important site" for groups from "minoritised and marginalised communities" to be able to "engage in self-expression". While she welcomed the new legislation, she said she wished it went further.

"For me, it would have been really important to see the 'creation' offence itself as a standalone offence. So at the moment, the way that it looks like the government are introducing it is that they have the non-consensual sharing of the image and then if you also created it, you get an aggravated sentence," she explained.

"So you get a higher sentence. Whereas, I would have liked to have seen two separate offences introduced like we have in Victoria. It'd be good to see more onus put on the websites, the digital platform providers and the technology developers who are creating these types of tools too."

A screenshot of the homepage of a nude deep-fake provider and an account bio offering to 'remove clothes' from every photo on Instagram. Source: Graphika.

Reflecting on the widely-circulated deep-fake images of Taylor Swift which emerged earlier this year, Flynn said celebrities and people with high profiles are "always going to be targets", but "if she can't prevent it from happening, what hope do the rest of us have?"

"I think she's an easy target for people to harm," she said. "I think that introducing these types of laws and having education around it, raising awareness provides an opportunity to address some of the harms that are happening to people — celebrity or otherwise."

These fake non-consensual celebrity porn videos now proliferate across the web. Here popstar Ariana Grande and actress Scarlett Johansson have been manipulated using face-swapping technology to create deepfake videos. Source: Supplied

One positive impact we're likely to see as a result of the legislative changes is a sharp reduction in the accessibility of these deep-fake tools, which are currently readily available to anybody seeking them out, and evidence suggests that is largely men.

"We really need to be ramping up our efforts in prevention education with young people through promoting respectful relationships. And that includes our digital society, our online worlds, because you can't separate those from offline worlds anymore," Flynn said.

The eSafety Commissioner's reporting portal, where anyone experiencing sexualised deep-fake abuse or any other kind of image-based abuse can seek help, is available here.

If you or someone you know is impacted by domestic violence, find help by calling 1800RESPECT on 1800 737 732, or contact Lifeline on 13 11 14.

Do you have a story tip? Email: newsroomau@yahoonews.com.
