Man convicted under Take It Down Act used 24 AI tools to create non-consensual nudes
The first conviction under the Take It Down Act has drawn attention not just because a man was found guilty, but because the case exposes how readily a single device can become a workshop for illicit content. While the law aims to curb the spread of non-consensual intimate images, prosecutors say the defendant, identified as Strahler, didn't stop after his initial arrest. Instead, investigators say he kept building a digital arsenal, installing more than two dozen applications and over a hundred models that generate realistic photographs.
The court filings suggest the effort wasn't a one-off experiment; they allege a systematic campaign that produced a massive volume of illegal imagery targeting both adults and minors. What makes the case especially striking is the sheer scale of the tools involved and the apparent intent to keep using them despite legal warnings. The following account of the police findings lays out just how extensive his setup was, and why the conviction matters beyond a single charge.
Cops found that Strahler "installed more than 24 AI platforms and more than 100 AI web-based models on his phone," which he used to create hundreds, if not thousands, of non-consensual intimate images (NCII) depicting both women and children. Court documents showed that he created the images to try to coerce victims and their mothers into sending genuine nude images, while also threatening rape and "leaving voicemails of him masturbating." According to the Columbus Dispatch, Strahler made some of the unlawful images of his exes, their family, and their friends "to scare women into reconciling with him." Additionally, he posted more than 700 images depicting real and "animated" persons "to a website dedicated to child sexual abuse." Cops also found that he posted NCII of at least one victim and her mother on a website called "Motherless," which encourages users to post "anything legal." And Strahler also posed as a victim on a pornographic site, where he "provided AI-generated images and video to at least one person," The Columbus Dispatch reported.
James Strahler II’s conviction marks the first application of the Take It Down Act, a federal measure aimed at curbing non‑consensual intimate imagery. He pleaded guilty to producing and distributing both real and AI‑generated explicit images of at least ten victims, including women and children. The Justice Department reported that he installed more than 24 AI platforms and over 100 web‑based models on his phone, using them to generate hundreds, perhaps thousands, of such images.
In at least one instance he fabricated a scene of a victim with her father and sent it to her mother and coworkers, a detail that underscores the personal harassment involved. The court documents indicate the images were created to harass six women he knew. Whether this case will serve as a deterrent for similar misuse of AI tools remains uncertain.
Can this precedent curb future abuse? The conviction demonstrates that the new statute can reach AI-facilitated abuse, but the breadth of enforcement under the law is still being defined.