Saturday, September 7, 2024
Europol warns of uptick in AI-aided child abuse images

AFP – Images of child sexual abuse created with artificial intelligence (AI) are on the rise, Europe’s policing agency warned yesterday, saying the material makes it increasingly difficult to identify victims and perpetrators.

Criminals have been adopting AI tools and services to carry out a range of crimes, from online fraud and cyberattacks to the creation of explicit images of children, Europol said.

“Cases of AI-assisted and AI-generated child sexual abuse material have been reported,” the Hague-based agency said in a new report.

“The use of AI which allows child sex offenders to generate or alter child sex abuse material is set to further proliferate in the near future,” Europol added in the 37-page report, which examines current online threats facing Europe.

The production of artificial abuse images increases “the amount of illicit material in circulation and complicates the identification of victims as well as perpetrators,” Europol said.

More than 300 million children a year were victims of online sexual exploitation and abuse, researchers at the University of Edinburgh said in May.

Offences ranged from so-called sextortion, where predators demand money from victims to keep images private, to the abuse of AI technology to create deepfake videos and pictures, the university’s Childlight Global Safety Institute said.

The advent of AI has caused growing concern around the world that the technology can be used for malicious purposes such as the creation of so-called “deepfakes” – computer-generated, often realistic images and video, based on a real template.

“The volume of self-generated sexual material now constitutes a significant and growing part of child sexual abuse material online,” Europol said.

“Even in the cases when the content is fully artificial and there is no real victim depicted, AI-generated child sex abuse material still contributes to the objectification and sexualisation of children,” Europol said.

PHOTO: ENVATO