Two teenage boys have been arrested for allegedly making deepfake nude images of their classmates, in what is believed to be the first U.S. instance of criminal charges relating to AI-generated nudes.

According to a report by WIRED, the high-school students from Miami, Florida, were arrested and charged with third-degree felonies in December for allegedly creating and sharing AI-generated nude images of male and female classmates without their consent.

A police report says the two boys, aged 13 and 14, are accused of using an unnamed “artificial intelligence application” to generate the explicit images of other students “between the ages of 12 and 13.” The report claims the boys shared the nonconsensual deepfake nudes with each other; the incident was then reported to a high-school administrator, who “obtained copies of the altered images.”

According to WIRED, local media report that the two students at Pinecrest Cove Academy were arrested on December 22 after the case was reported to the Miami-Dade Police Department. The pair were charged with third-degree felonies, the same level of crime as grand theft auto or false imprisonment, under a 2022 Florida law that criminalizes the dissemination of “any altered sexual depiction” of a person without the victim’s consent.
The two boys accused of making the images were transported to the Juvenile Service Department “without incident.”

The First-Ever Arrest in the U.S.

The case appears to involve the first arrest and criminal charges of their kind in the U.S. relating to the sharing of AI-generated deepfake nude images.

In recent months, there have been several incidents of high-school students using widely available AI apps to make deepfake nude photographs of their female classmates, including at schools in New Jersey, Seattle, and Los Angeles. In those earlier cases, however, uncertainty over the legality of such AI-generated images, and over how or whether to punish their creators, left parents, schools, and law enforcement struggling to respond to the technology.

There is currently no federal legislation in the U.S. protecting people from having their images used without consent in deepfake porn or with any associated technology. This has left individual states to tackle the impact of generative AI on child sexual abuse material, nonconsensual deepfakes, and revenge porn on their own.

The Florida incident could therefore become a landmark case for criminal charges over the sharing of nonconsensual AI-generated porn in the U.S.

Image credits: Header photo licensed via Depositphotos.