There’s a lot of debate about the role of technology in kids’ lives, but sometimes we come across something unequivocally bad. That’s the case with AI “nudification” apps, which teenagers are using to generate and share fake naked photos of their classmates.
Now advocates — including some teens — are backing laws that impose penalties for creating and sharing deepfake nudes. Legislation has passed in Washington, South Dakota, and Louisiana, and is in the works in California and elsewhere. Meanwhile, Rep. Joseph Morelle (D-NY) has reintroduced a bill that would make sharing the images a federal crime.
Francesca Mani, a 15-year-old student at Westfield High School in New Jersey whose deepfaked image was shared, started pushing for legislative and policy change after she saw her male classmates making fun of girls over the images. “I got super angry, and, like, enough was enough,” she told Vox in an email sent via her mother. “I stopped crying and decided to stand up for myself.”
Supporters say the laws are necessary to keep students safe. But some experts who study technology and sexual abuse argue that they’re likely to be insufficient, since the criminal justice system has been so ineffective at rooting out other sex crimes. “It just feels like it’s going to be a symbolic gesture,” said Amy Hasinoff, a communications professor at the University of Colorado Denver who has studied image-based sexual abuse.
She and others recommend tighter regulation of the apps themselves so the tools people use to make deepfake nudes are less accessible in the first place. “I am struggling to imagine a reason why these apps should exist” without some form of consent verification, Hasinoff said.
Deepfake nudes are a new kind of sexual abuse
So-called revenge porn — nude photos or videos shared without consent — has been a problem for years. But with deepfake technology, “anybody can just put a face into this app and get an image of somebody — friends, classmates, coworkers, whomever — completely without clothes,” said Britt Paris, an assistant professor of library and information science at Rutgers who has studied deepfakes.
There’s no hard data on how many American high school students have experienced deepfake nude abuse, but one 2021 study conducted in the UK, New Zealand, and Australia found that 14 percent of respondents ages 16 to 64 had been victimized with deepfake imagery.
Nude images shared without consent can be traumatic, whether they’re real or not. When she first found out about the deepfakes at her school, “I was in the counselor’s office, emotional and crying,” Mani said. “I couldn’t believe I was one of the victims.”
When sexual images of students are shared around school, the victims can experience “shaming and blaming and stigmatization,” thanks to stereotypes that denigrate girls and women in particular for being, or appearing to be, sexually active, Hasinoff said. That’s the case even if the images are fake, because other students may not be able to tell the difference.
Moreover, fake images can follow people throughout their lives, causing real harm. “These images put these young women at risk of being barred from future employment opportunities and also make them vulnerable to physical violence if they are recognized,” Yeshi Milner, founder of the nonprofit Data for Black Lives, told Vox in an email.
Stopping deepfake abuse may require reckoning with AI
To combat the problem, at least nine states have passed or updated laws targeting deepfake nude images in some way, and many others are considering them. In Louisiana, for example, anyone who creates or distributes deepfakes of minors can be sentenced to five or more years in prison. Washington’s new law, which takes effect in June, treats a first offense as a misdemeanor.
The federal bill, first introduced in 2023, would give victims or parents the ability to sue perpetrators for damages, in addition to imposing criminal penalties. It has not yet received a vote in Congress but has attracted bipartisan support.
However, some experts worry that the laws, while potentially helpful as a statement of values, won’t do much to fix the problem. “We don’t have a legal system that can handle sexual abuse,” Hasinoff said, noting that only a small percentage of people who commit sexual violence are ever charged. “There’s no reason to think that this image-based abuse stuff is any different.”
Some states have tried to address the problem by updating their existing laws on child sexual abuse images and videos to include deepfakes. While this might not eliminate the images, it would close some loopholes. (In one recent New Jersey lawsuit, lawyers for a male high school student argued he should not be barred from sharing deepfaked photos of a classmate because federal laws were not designed to apply “to computer-generated synthetic images.”)
Meanwhile, some lawyers and legal scholars say that the way to really stop deepfake abuse is to target the apps that make it possible. Lawmakers could regulate app stores to bar them from carrying nudification apps without clear consent provisions, Hasinoff said. Apple and Google have already removed several apps that offered deepfake nudes from the App Store and Google Play.
However, users don’t need a specific app to make nonconsensual nude images; many AI image generators could potentially be used in this way. Legislators could require developers to put guardrails in place to make it harder for users to generate nonconsensual nude images, Paris said. But that would require challenging the “unchecked ethos” of AI today, in which developers are allowed to release products to the public first and figure out the consequences later, she said.
“Until companies can be held accountable for the types of harms they produce,” Paris said, “I don’t see a whole lot changing.”