The abuse typically starts with a photo downloaded from social media, which a deepfake app transforms into a non-consensual nude image. These images can then be shared across an entire school within minutes, leaving victims humiliated and violated.
The crisis has grown sharply in recent years as the technology has become more accessible, affecting over 600 pupils in at least 28 countries since 2023, with incidents reported in North America, South America, Europe, Australia, and East Asia.
Victims often suffer severe distress and fear the images will follow them forever; some avoid school altogether rather than face the classmates who created the explicit images of them. Legal actions are underway to combat the problem.
The true scale is likely far greater than reported cases suggest: one survey estimated that 1.2 million children had sexual deepfakes made of them last year. Schools and law enforcement often struggle to respond to such incidents effectively.