Italy’s prime minister Giorgia Meloni seeks €100,000 in damages over deepfake pornographic videos
Italy’s prime minister is seeking €100,000 (£85,374) in damages after deepfake pornographic videos of her were shared on the internet.
‘Deepfakes’ are images or videos in which the face of one person, in this case Giorgia Meloni, is digitally superimposed onto the body of another.
Two men accused of making the videos, a father and son aged 74 and 40, are being investigated.
Detectives working on the case were able to track down the mobile device that was used to upload the videos.
The two suspects are accused of defamation, which can carry a custodial sentence under Italian law.
The videos in question, which date back to 2022 – before Ms Meloni was appointed prime minister – were posted to a US pornographic website, where they were viewed “millions of times” over several months, according to an indictment. Ms Meloni is now due to testify at a court in the Sardinian city of Sassari on 2 July as part of her case.
If her claim is successful, Ms Meloni will donate the €100,000 to a fund to support women who have been victims of male violence, her legal team said.
Describing the sum as “symbolic”, Maria Giulia Marongiu, Ms Meloni’s lawyer, said the demand for compensation was meant to “send a message to women who are victims of this kind of abuse of power not to be afraid to press charges”.
Advances in artificial intelligence mean that deepfakes have become not only more commonplace but also increasingly realistic. A number of celebrities have also been targeted by those who make deepfake videos and images.
US popstar Taylor Swift was perhaps the most high-profile person to be affected by deepfake porn when sexually explicit images of her were shared on social media and in chatrooms.
US media reports said some posts sharing the images, showing the popstar in sexually suggestive and explicit positions, amassed more than 27 million views and 260,000 likes in 19 hours, before the account that posted the images was suspended.
The offensive images, and the length of time it took to have them removed, sparked fury among Swift’s fans and others, who expressed alarm at the “violently misogynist” nature of the pictures.
Many have sounded alarm bells about how deepfakes can mislead members of the public. Previous research conducted by cybersecurity firm Deeptrace indicated that around 96 per cent of all deepfake videos are non-consensual pornography, and that women are the targets in 96 per cent of cases.
It is not just celebrities and other public figures who have been targeted. In 2022 a woman spoke out after discovering deepfakes of herself online. Kate Isaacs, who campaigns to have non-consensual content removed from the internet, told The Independent it was “violating”.
The sharing of deepfake images was made a criminal offence in the UK in 2023.