Deepfake crackdown should go further and remove 'nudify' apps, inquiry told

Survivors and advocacy groups have called for tougher penalties to respond to the rise of fake sexualised images and videos used to target women.


Deepfakes were often accompanied by doxing, where personal information was shared online. Source: Getty / globalmoments/iStockphoto

Creating and sharing fake sexual images could soon become much harder, as the online safety regulator moves against apps used mainly to target women.

The generation of "deepfakes" is rising rapidly alongside technology powered by artificial intelligence (AI).

Deepfakes are digitally altered images of a person or their body; AI can be used to generate an image from a photo, or to superimpose a person's face onto pornographic material.

Several experts detailed the potential harms during a parliamentary hearing on Tuesday.

Nicole Lambert from the National Association of Services Against Sexual Violence said young people had taken their lives after becoming victims of deepfake material.

Call for tougher penalties for tech companies

Rachel Burgin, from Rape and Sexual Assault Research and Advocacy, highlighted a survey of perpetrators in which many respondents admitted the biggest deterrent to committing abuse would have been criminal penalties.

"What we're doing for prevention in Australia doesn't work, that's why we've had more than 50 women killed at the hands of men this year," she told the committee.

Deepfakes were often accompanied by doxing, where personal information was shared online.

This made people fear for their safety because sexual violence was a precursor to homicide, she said.

Abuse victim Noelle Martin chastised tech companies and social media platforms for failing to take down such material or search results, with billions of people accessing the top 40 non-consensual nude sites.

She called for severe fines and possible criminal liability.

Push for removal of 'nudify' apps

The eSafety Commissioner said she would welcome powers to enable her to take down apps that primarily exist to "nudify" women or create synthetic child sexual abuse material.

"Some might wonder why apps like this are allowed to exist at all, given their primary purpose is to sexualise, humiliate, demoralise, denigrate and create child sexual abuse material," Julie Inman Grant told the hearing.
"These apps make it simple and cost free for the perpetrator, while the cost to the target is one of lingering and incalculable devastation."

The use of deepfakes to control women in abusive relationships was also explored.

"People will and do create images of their partners as a mechanism of exerting control over them in a family violence context," Burgin said.

Women often targeted

More than 96 per cent of deepfakes targeted women, the inquiry heard.

In one 2020 incident, an AI chatbot was used to generate and share fake nude images of more than 680,000 women, law professor Rebecca Delfino said.

A cybersecurity company that tracked deepfake videos found 90 to 95 per cent were non-consensual porn, Delfino said.
American singer-songwriter Taylor Swift became the most high-profile celebrity to be targeted by sexually explicit deepfake images.

In the same month, Victorian Animal Justice Party MP Georgie Purcell had an image of her altered by a television network, which the network claims was "inadvertently altered by Photoshop".

Criminal penalties proposed

The Albanese government wants to criminalise the transmission of sexual material relating to adults without their consent.

The offences would capture unaltered material as well as content produced using 'deepfake' technology.

The government's legislative changes should also capture the creation of images and threats to produce such material, the committee heard.

Attorney-General Mark Dreyfus argued the Commonwealth had legal limits to what it could tackle, but Marque Lawyers managing partner Michael Bradley believed the government had the power to broaden its bill.

"In the terrorism realm, there are some pretty broad offences that criminalise accessing and creating content so I don't think it's that much of a stretch," Bradley said.

The committee will report by 8 August.

If you or someone you know is impacted by sexual assault, call 1800RESPECT on 1800 737 732 or visit the 1800RESPECT website. In an emergency, call 000.

Readers seeking crisis support can contact Lifeline on 13 11 14, the Suicide Call Back Service on 1300 659 467 and Kids Helpline on 1800 55 1800 (for young people aged up to 25). More information and support with mental health is available by calling 1300 22 4636.

supports people from culturally and linguistically diverse backgrounds.

Published 24 July 2024 7:33am
Updated 24 July 2024 8:29am
Source: AAP, SBS


