Could AI-Generated Porn Help Protect Children?

Now that generative AI models can produce photorealistic, fake images of child sexual abuse, regulators and child safety advocates are worried that an already-abhorrent practice will spiral further out of control. But lost in this fear is an uncomfortable possibility—that AI-generated child sexual material could actually benefit society in the long run by providing a less harmful alternative to the already-massive market for images of child sexual abuse.

The growing consensus among scientists is that pedophilia is biological in nature, and that keeping pedophilic urges at bay can be incredibly difficult. “What turns us on sexually, we don’t decide that—we discover that,” said psychiatrist Dr. Fred Berlin, director of the Johns Hopkins Sex and Gender Clinic and an expert on paraphilic disorders. “It’s not because [pedophiles have] chosen to have these kinds of urges or attractions. They’ve discovered through no fault of their own that this is the nature of what they’re afflicted with in terms of their own sexual makeup … We’re talking about not giving into a craving, a craving that is rooted in biology, not unlike somebody who’s having a craving for heroin.”

Ideally, psychiatrists would develop a method to cure those who view child pornography of the inclination to view it. But short of that, replacing the market for child pornography with simulated imagery may be a useful stopgap.

There is good reason to see AI-generated imagery as the latest negative development in the fight against child sexual abuse. Regulators and law enforcement already comb through an enormous number of images every day attempting to identify victims, according to a recent paper by the Stanford Internet Observatory and Thorn. As AI-generated images enter the sphere, it becomes harder to discern which images include real victims in need of help. Plus, AI-generated images rely on the likenesses of real people, including real children, as a starting point, which, if the images retain those likenesses, is abuse of a different nature. (That said, AI does not inherently need to train on actual child pornography to produce a simulated version of it; it can instead combine training on adult pornography with training on the likenesses of children.)

Finding a practical method of discerning which images are real, which images are of real people put into fake circumstances, and which images are fake altogether is easier said than done. The Thorn report claims that within a year it will become significantly easier for AI to generate images that are essentially indistinguishable from real images. But this could also be an area where AI might play a role in solving a problem it has created. AI can be used to distinguish between different forms of content, thereby aiding law enforcement, according to Rebecca Portnoff, head of data science at Thorn. For example, regulators could require AI companies to embed watermarks in open-source generated image files, or law enforcement could use existing passive detection mechanisms to track the origin of image files.

When it comes to the generated images themselves, not everyone agrees that satisfying pedophilic urges in the first place can stem them in the long run.

“Child porn pours gas on a fire,” said Anna Salter, a psychologist who specializes in the profiles of high-risk offenders. In Salter’s and other specialists’ view, continued exposure can reinforce existing attractions by legitimizing them, essentially whetting viewers’ appetites, which some offenders have indicated is the case. And even without that outcome, many believe that viewing simulated immoral acts harms the viewer’s own moral character, and thus perhaps the moral fabric of society as well. From that perspective, any inappropriate viewing of children is an inherent evil, regardless of whether a specific child is harmed. On top of that, the potential normalization of those viewings can be considered a harm to all children.


There is also the practical concern that, while viewers of AI-generated child pornography are not contributing to the re-victimization of an abused child, generated images will not stem the abuse itself. That’s because the makers of child pornography are typically child abusers, who will not refrain from abusing children because of changing demand for the images they collect of the abuse they inflict.

Still, satisfying pedophilic urges without involving a real child is obviously an improvement over satisfying them based on a real child’s image. While the research is inconclusive, some pedophiles have revealed that they rely on pornography to redirect their urges and find an outlet that does not involve physically harming a child—suggesting that, for those individuals, AI-generated child pornography actually could stem behavior that would hurt a real child.

As a result, some clinicians and researchers have suggested that AI-generated images could be used to rehabilitate certain pedophiles, either by letting them obtain from generated images the sexual catharsis they would otherwise seek in child pornography, or by having them practice impulse management on those images so that they can better control their urges. And with more resources for treatment available and less stigma attached to them, more pedophiles might feel prepared to seek help in the first place.

In addition, some of the most widespread fears about the risk of AI-generated child porn misunderstand the nature of pedophilia. There is the gateway fear: If virtual imagery of child sexual abuse is enabled, is that not the first step toward hands-on offenses? And what about the temptations posed for those who would not otherwise watch child pornography in the first place? On the latter point, there is no evidence that individuals without pedophilic proclivities will become interested in viewing child pornography merely because it becomes accessible in an AI-generated form.

While there is a correlation between viewing child pornography and committing hands-on offenses, the scientific research on the topic does not indicate that the former causes the latter. “It’s very clear there is a group of individuals for whom looking at these images is a voyeuristic interest and an end in and of itself, not a gateway to something more severe,” said Berlin. “People who are looking at child pornography who are also going to approach a child sexually—the reason that group is looking at child pornography is they already have an interest in being sexual with a child.”

Plus, redirecting potential viewers of child pornography to AI-generated images could help victims by preventing their images from being continually viewed, either by pedophiles or by law enforcement, who often spread existing child pornography on the internet to entrap potential viewers. “If there were a lot of AI-generated images out there, it would clearly dilute dramatically the percentage of those that would be actual children, and therefore there would be far fewer viewing[s] of images [of] people who have been damaged by being a part of it,” Berlin added.

Ultimately, AI-generated child pornography could act as a form of harm reduction, a philosophy that underlies many public health policies. It is similar to the logic, for example, behind needle exchange programs for individuals suffering from drug addiction: Because we cannot stop drug use wholesale, it serves society to find ways to ensure that consumption occurs safely. This is not a perfect analogy, and it is understandably a tough sell to transfer this framework to a subject as horrific as images depicting the abuse of children, especially without studies demonstrating the effects of controlled viewings of AI-generated child pornography. But our rightful contempt for child pornography should not prevent us from considering the possibility that fake forms of such images could stand as an improvement over abuse.


For pedophiles who do not wish to harm children and therefore find satisfaction in ways that avoid doing so, “we consider this a good outcome for them, for children that might otherwise have been victimized, and for society at large,” wrote an anonymous representative from Virtuous Pedophiles, an online support group for non-offending pedophiles, to me over email.

Of course, using AI-generated images as a form of rehabilitation, alongside existing forms of therapy and treatment, is not the same as allowing their unbridled proliferation on the web.

“There’s a world of difference between the potential use of this content in controlled psychiatric settings versus what we’re describing here, which is just, anybody can access these tools to create anything that they want in any setting,” said Portnoff, from Thorn.

And even using AI-generated child porn in a controlled environment would not be a one-size-fits-all solution. According to Berlin, clinicians evaluate patients on an individual basis and would have to determine whether exposure to explicit images would diminish a given patient’s urges or intensify them.

Ultimately, though, incorporating AI-generated images into existing forms of therapy could be one way of diminishing risk. “We’re all for the same thing, which is the safety of children and others in the community,” said Berlin. “We have to do both sides of the coin. We not only have to assist the victims, but we have to look at those who might be a risk to victims, and help them not to remain a risk.”

About Danielle Bernstein
