
pat_k

(13,482 posts)
2. Yikes! There is even a new acronym: non-consensual intimate imagery (NCII)
Thu Apr 16, 2026, 02:39 AM

Both users and creators of these "tools" MUST be held accountable.

Time to find model laws that actually work. I don't know enough about the Take It Down Act to judge whether it can actually address this shit.

Jane Doe v. AI/Robotics Venture Strategy 3 Ltd., a lawsuit against ClothOff, is described here:
https://law.yale.edu/yls-today/news/clinics-file-suit-against-website-generates-nonconsensual-nude-images

It appears to be running into significant obstacles on basics like service of process and international jurisdiction.

The SF City Attorney has shut down at least 10 sites:
https://sfcityattorney.org/city-attorney-shuts-down-10-websites-that-create-nonconsensual-deepfake-pornography/

So there is some progress, but it seems to be waaaayyyy behind.

