A New Zealand MP has taken a bold stand against the growing threat of deepfake pornography by holding up a censored, AI-generated nude image of herself in Parliament.
Laura McClure revealed she created the image in under five minutes using free online tools to demonstrate how easily such technology can be misused.
Speaking in the House, McClure said a basic Google search revealed hundreds of websites that allow users to “nudify” photos and videos — the vast majority targeting women.
“It gave me the ick having to stand in Parliament and hold up the photo of myself, even knowing that it’s not me,” she told MPs.
According to the Law Association, between 90% and 95% of deepfake pornography is non-consensual, and current online safeguards are failing to keep the technology out of dangerous hands.
McClure said she didn’t create the fake image for shock value, but to push for urgent legislative reform to prevent abuse.
In the UK, predators who produce or share sexually explicit deepfakes without consent now face up to two years in prison. But McClure says New Zealand still lags in regulation.
It’s unclear what shape a potential New Zealand law might take, but campaigners hope it will follow the UK’s example.
The UK’s legislation builds on existing protections, including the Voyeurism (Offences) Act 2019, which criminalises acts such as upskirting.
Data released this year underscores growing concerns among young people: 57% of under-18s surveyed fear becoming victims of deepfake pornography.
A separate report by cybersecurity firm ESET found that 33% of women said intimate images they shared were later misused.
Shockingly, 25% said they had been threatened with exposure online, while 28% said their images had already been posted without consent.
McClure’s message to lawmakers is clear: without urgent legal reform, deepfake abuse could become a global epidemic.