Artificial intelligence opens the door to a worrying trend of people creating realistic images of children in sexual contexts, which could increase the number of real-life cases of child sex crimes, experts warn.
AI platforms that can mimic human conversation or create realistic images exploded in popularity late last year after the release of the ChatGPT chatbot, which served as a watershed moment for the use of artificial intelligence. While the technology has piqued the curiosity of people around the world for work or school tasks, others have embraced the platforms for more nefarious purposes.
The National Crime Agency, which is Britain’s main agency for tackling organized crime, warned this week that the proliferation of explicit machine-generated images of children is having a “radicalizing” effect by “normalising” pedophilia and disrupting attitudes towards children.
“We believe viewing these images – whether real or AI-generated – significantly increases the risk that offenders will themselves sexually abuse children,” NCA Director General Graeme Biggar said in a recent report.
Graeme Biggar, Director General of the National Crime Agency (NCA), during a meeting of the Northern Ireland Policing Board at James House, Belfast, on June 1, 2023. (Liam McBurney/PA Images via Getty Images)
The agency estimates that up to 830,000 adults – 1.6% of the UK’s adult population – pose some degree of sexual risk to children. That figure is roughly ten times the UK prison population, according to Biggar.
The majority of child sexual abuse cases involve the viewing of explicit images, Biggar says, and with the help of AI, the creation and viewing of sexual images could “normalize” the abuse of children in the real world.
“[The estimated figures] partly reflect a better understanding of a threat that has historically been underestimated, and partly a genuine increase caused by the radicalizing effect of the internet, where the widespread availability of videos and images of abused and raped children, and of groups sharing and discussing the images, has normalized such behavior,” Biggar said.

Illustrations of artificial intelligence are seen on a laptop computer with books in the background in this July 18, 2023 illustration photo. (Jaap Arriens/NurPhoto via Getty Images)
In the United States, a similar explosion in the use of AI to create sexual images of children is underway.
“Images of children, including content from known victims, are being repurposed for this truly diabolical output,” Rebecca Portnoff, director of data science at the child-protection nonprofit Thorn, told The Washington Post last month.
“Victim identification is already a needle in a haystack problem, where law enforcement is trying to find a child in danger,” she said. “The ease of use of these tools is a significant change, as well as the realism. It just makes it harder.”
Popular AI sites that can create images based on simple prompts often have community guidelines preventing the creation of disturbing photos.

Teenage girl in a dark room. (Getty Images)
These platforms are trained on millions of images from across the internet that serve as building blocks for AI to create compelling representations of people or places that don’t actually exist.
Midjourney, for example, calls for PG-13 content that avoids “nudity, sexual organs, fixation on bare breasts, people in showers or toilets, sexual images, fetishes.” DALL-E, OpenAI’s image creation platform, allows only G-rated content, prohibiting images that show “nudity, sexual acts, sexual services, or content otherwise meant to arouse sexual excitement.” However, according to various reports on AI and sex crimes, bad actors on dark web forums discuss workarounds for creating disturbing images.

Police car with 911 sign. (Getty Images)
Biggar noted that AI-generated images of children also plunge police and law enforcement into a maze of distinguishing fake images from those of real victims who need help.
“Using AI for child sexual abuse will make it harder for us to identify real children who need protection and further normalize abuse,” the NCA’s chief executive said.
AI-generated images can also be used in sextortion scams, with the FBI issuing a warning about the crimes last month.
Deepfakes typically involve using deep learning AI to edit videos or photos of people so that they appear to be someone else, and they have been used to harass victims, including children, and extort money from them.
“Malicious actors use content manipulation technologies and services to exploit photos and videos – typically captured from an individual’s social media account, the open internet, or requested from the victim – into sexually themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums or pornographic websites,” the FBI said in June.
“Many victims, including minors, are unaware that their images have been copied, manipulated and disseminated until someone else brings them to their attention.”