To Deliver on Its Target of Halving VAWG in a Decade, the Government Must Ban AI Nudifying Apps

My experience from nine years of working in organisations that support vulnerable women and children is that, in many respects, we've moved backwards.

I’ve seen what progress we have made towards tackling violence against women and girls (VAWG) and safeguarding children consistently outpaced by technological developments. In the last five years alone, police data shows a 37% increase in VAWG, while our capacity to regulate lags behind rapid advances in tech.

The latest technological form of exploitation and abuse of children and young people comes through generative AI. Nude deepfakes, the term for AI-generated sexually explicit images and videos that often depict real people, have surged: since 2017, deepfake sexual content online has increased by over 400%.

Earlier this year, the Online Safety Act made it illegal to share nude deepfake images of adults without consent. It is already illegal to create, share or possess a nude deepfake image of a child – but the tools and apps that generate these images remain legal, cheap and easy to use, and they continue to cause harm.

The loophole is analogous to banning the possession of zombie knives but not their sale – and we must close it. The harm these ‘nudifying’ apps and tools are doing to our children is only becoming more pronounced.

Recent estimates suggest that over half a million children have some experience of AI-generated sexually explicit images. Polling by the online safety non-profit Internet Matters found that a majority of teenagers feel that having a nude deepfake image created and shared of them would be worse than a real image being shared. Teenagers spoke about the loss of bodily autonomy they would feel if a deepfake image were made of them, and the concern that they might not know it had been made, who had done it or why. They spoke about the fear it could cause: friends, teachers and parents thinking it was real and seeing them differently, or the image completely misrepresenting them.

While a majority of both boys and girls think a nude deepfake of them would be worse than a real image, this is a problem that disproportionately affects girls. Many nudifying tools do not work on images of boys and men, and a study by Security Hero found that 99% of nude deepfakes are of women and girls. There have already been cases in schools in the UK and abroad of such images of girls being created and shared, and while the legal net has been closing around activities related to nude deepfakes, this is not enough to stem the tide of harm.

It is imperative that we ban so-called nudifying apps. Some 84% of teenagers and 80% of parents agree that such tools should be banned for everyone in the UK, including adults. I urge the Government to heed these calls as we work to deliver the target of halving violence against women and girls within a decade.

 
