9 supporters

Anonymous
Somewhere
1 week ago · It's a normal thing that should happen in a normal world. No nudes should be allowed.
Kayode Sobande
Somewhere
1 week ago · I support the petition
Anonymous
Somewhere
2 weeks ago · I am signing because I believe it is important to protect children.
Anonymous
Somewhere
2 weeks ago · PETITION AGAINST THE CREATION OF NON-CONSENSUAL AI-GENERATED NUDE IMAGES

I am deeply troubled by the reality that artificial intelligence can now be used to create fake nude images of real people without their consent. This fear is not hypothetical—it is already happening to countless women and girls, and it represents a disturbing violation of privacy, dignity, and personal safety.

The use of AI to generate non-consensual nude and sexual images disproportionately affects women and girls, who are overwhelmingly the primary targets. This is not a distant or emerging threat; it is a present and growing crisis. Recent research estimates that four children in every class of 30 have experienced AI-generated nude deepfakes, and the most popular website dedicated to non-consensual deepfake content reportedly receives around 17 million visits each month. Across the UK, teenage girls are discovering that classmates have used easily accessible apps to convert their social media photos into explicit images and circulate them in private group chats. Even major online platforms, including X (formerly Twitter), have been found to allow their AI tools to generate non-consensual intimate images.

AI technology has advanced at an alarming speed, enabling the creation of hyper-realistic explicit content in seconds. These images can destroy reputations, damage relationships, and cause severe psychological harm, including anxiety, depression, and trauma. This practice amounts to a form of digital sexual abuse—an invasion of personal autonomy that often leaves victims feeling exposed, powerless, and unsafe.

Research by Deeptrace found that over 95% of deepfakes in 2019 were pornographic and almost exclusively focused on women. An estimated 99% of nude deepfakes feature girls and women, and so-called “nudifying” tools frequently do not work on images of boys or men. This highlights the deeply gendered nature of this abuse.
Despite the seriousness of this issue, UK law currently contains a critical gap. While the sharing of deepfake pornography has been criminalised, the creation of such material is not. This means an individual can generate explicit images of someone without their consent and face no legal consequences, provided the images are not shared. However, the harm occurs at the moment of creation. The knowledge that someone could be creating such images at any time instils fear and insecurity into women’s daily lives.
Wilson Benson
NG
2 weeks ago · I support
Jeffery Ufeli
NG
2 weeks ago · Signed this petition from NG on January 14, 2026
Bolaji Biodun
Somewhere
2 weeks ago · Signed this petition from Somewhere on January 14, 2026
Remy Bankole
Somewhere
2 weeks ago · Signed this petition from Somewhere on January 14, 2026
Justin Majek
Manchester, GB
2 weeks ago · Signed this petition from Manchester, GB on January 14, 2026