Dawn
13:54 Feb 05, 2026
The United Nations Children’s Fund (Unicef) on Thursday said it was increasingly “alarmed” by reports of AI-generated sexualised images involving children, calling on governments and the AI industry to prevent the creation and dissemination of such content.

In a statement, the UN agency said: “The harm from deepfake abuse is real and urgent. Children cannot wait for the law to catch up.”

“Deepfakes — images, videos, or audio generated or manipulated with artificial intelligence and designed to look real — are increasingly being used to produce sexualised content involving children through ‘nudification’, where AI tools are used to strip or alter clothing in photos to create fabricated nude or sexualised images,” Unicef said in its statement.

It added that the “unprecedented” situation presented new challenges for “prevention, education, legal frameworks, and response and support services for children”. However, current prevention efforts were “insufficient when sexual content can be artificially generated”,...