"It worries me that it's so normalised. He obviously wasn't hiding it. He didn't feel this was something he shouldn't be doing. It was in the open and people saw it. That's what was quite shocking." A headteacher is describing how a teenage boy, sitting on a bus on his way home from school, casually pulled out his phone, selected a picture from social media of a girl at a neighbouring school and used a "nudifying" app to doctor her image.
Ten years ago it was sexting and nudes causing havoc in classrooms. Today, advances in artificial intelligence (AI) have made it child's play to generate deepfake nude images or videos, featuring what appear to be your friends, your classmates, even your teachers. This may involve removing clothes, getting an image to move suggestively or pasting someone's head on to a pornographic image.
The headteacher does not know why this particular girl -- a student at her school -- was selected, whether the boy knew her, or whether it was completely random. It only came to her attention because he was spotted by another of her pupils who realised what was happening and reported it to the school.