Man Allegedly Filmed Kids at Disney World With GoPro to Make AI Child Porn Images

An alleged pedophile was arrested after he used a GoPro to film kids at Disney World in order to create thousands of AI child abuse images.

Justin Culmo was arrested in mid-2023 after he confessed to creating thousands of illegal images of real children that he filmed at Disney World in Florida, according to a report in Forbes.

According to Forbes, Culmo used a GoPro to record and victimize children who visited Disney World.

He then used AI image generator Stable Diffusion to turn thousands of photographs of children visiting the park into child sexual abuse material.

Culmo was arrested last year after spending over a decade as “one of about 20 high-priority targets” for global child exploitation investigators.

Forbes reports that, after his arrest, Culmo confessed to recording kids at Disney World and using Stable Diffusion to turn the images into child porn.

Culmo was indicted in Florida for a range of child exploitation crimes. This includes allegations of abusing his two daughters, secretly filming minors, and distributing child sexual abuse material (CSAM) on the dark web, a section of the internet that isn’t visible or accessible to search engines.

The Ruthless Exploitation of AI

A jury trial has since been set for Culmo in October; he is reportedly pleading not guilty. He has not been charged with AI CSAM production, which is also a crime under U.S. law.

“This is not just a gross violation of privacy, it’s a targeted attack on the safety of children in our communities,” Jim Cole, a former Department of Homeland Security agent who tracked the defendant’s online activities during his 25 years as a child exploitation investigator, tells the publication.

“This case starkly highlights the ruthless exploitation that AI can enable when wielded by someone with the intent to harm.”

The reported criminal activity stands as one of the most disturbing examples of AI image manipulation to date, potentially affecting numerous Disney World visitors. Despite this, Disney tells Forbes that law enforcement has not reached out regarding the alleged incidents at its park.

In May, a U.S. man was charged by the FBI for allegedly producing 13,000 sexually explicit and abusive AI images of children on the popular Stable Diffusion model.

Meanwhile, in March, two teenage boys from Miami, Florida, were arrested by police for allegedly making deepfake nude images of their high-school classmates, in what was believed at the time to be the first U.S. instance of criminal charges relating to AI-generated nudes.

An internet watchdog agency warned that the rise of AI-generated child sex abuse images online could get even worse if controls aren’t put on the technology that generates deepfake photos.

Image credits: Header photo licensed via Depositphotos.