Teens using AI to make fake nudes of classmates…
They’ve turned tech into a weapon, and no one is safe from the fallout.
Teens are using artificial intelligence to whip up disturbingly realistic nude photos of their classmates, then share them like digital wildfire, sending shockwaves through schools and leaving experts fearing the worst.
The AI-powered tools, often dubbed “nudify” apps, are as sinister as they sound. With just a headshot, often lifted from a yearbook photo or a social media profile, these apps can fabricate explicit deepfake images that look scarily real.
And yes, it’s already happening in schools.
These hyper-realistic images, forged with AI tools, are turning bullying into a high-tech nightmare.
“We’re at a place now where you can be doing nothing and stories and pictures about you are posted online,” Don Austin, superintendent of the Palo Alto Unified School District, told Fox News Digital.
“They’re fabricated. They’re completely made up through AI and it can have your voice or face. That’s a whole other world.”
This is a full-blown digital disaster. Last summer, the San Francisco City Attorney’s office sued 16 so-called “nudify” websites for allegedly violating laws around child exploitation and nonconsensual images.
Those sites alone racked up more than 200 million visits in the first half of 2023.
But catching the tech companies behind these tools? That’s like playing a game of Whac-A-Mole.
Most have skated past existing state laws, though some states, like Minnesota, are trying to pass legislation to hold them accountable for the havoc they’re wreaking.
Still, the tech moves faster than the law, and kids are getting caught in the crossfire.
Josh Ochs, founder of SmartSocial, an organization that trains families on online safety, told Fox News Digital that AI-generated nudes are causing “extreme harm” to teens across the country.
“Kids these days will upload maybe a headshot of another kid at school and the app will recreate the body of the person as though they’re nude,” Ochs told the outlet.
“This causes extreme harm to that kid that might be in the photo, and especially their friends as well and a whole family,” he noted.
He said parents need to stop tiptoeing around their kids’ digital lives and start laying down boundaries.
“Before you give your kids a phone or social media, it’s time to have that discussion early and often. Hey, this is a loaner for you, and I can take it back at any time because you could really hurt our family,” Ochs said.
In February, the U.S. Senate unanimously passed a bill to criminalize publishing, or even threatening to publish, nonconsensual AI deepfake porn.
It now awaits further action.
Austin said the only way to get ahead of the curve is to keep talking, with parents, teachers, students, and anyone else who will listen.
“This isn’t going away,” he warned. “It’s evolving — and fast.”