I continue to worry about the ease of producing and disseminating deepfake imagery. Recent reporting from Wired and Indicator, for example, noted that:

The true scale of deepfake sexual abuse taking place in schools is likely much higher. One survey by the United Nations children’s agency Unicef estimates that 1.2 million children had sexual deepfakes created of them last year. One in five young people in Spain told Save the Children researchers that deepfake nudes had been created of them. Child protection group Thorn found one in eight teens know someone targeted, and in 2024, 15 percent of students surveyed by the Center for Democracy and Technology said they knew about AI-generated deepfakes linked to their school.

…

In South Korea and Australia, schools have given pupils the option not to have their photos in yearbooks or stopped posting images of students on their official social media accounts, citing their potential use for deepfake abuse. “Around the world, there have been…