AI Represents New Child Sex Abuse Frontier


Artificial intelligence, also known as AI, may represent the next big challenge for law enforcement focused on finding and stopping child sexual exploitation, according to a recent report.  The National Police Chiefs' Council of the United Kingdom, as reported by the BBC, expressed "outrage" at the misuse of certain online content-sharing platforms to distribute imagery created using artificial intelligence, including life-like child sex abuse material.

Investigators have found that software known as Stable Diffusion, originally intended to generate images for graphic design purposes, has been used to create images of child sexual abuse.  The program allows users to enter word prompts that cue the software to create images; apparently, some users offer prompts that result in life-like child pornography.  The subjects depicted range in age from toddlers to teens and include both males and females.

In the UK, a pseudo-image generated by AI is treated, legally speaking, the same as a real image.  It is illegal to generate, sell, transmit, or possess any such image, and doing so carries hefty fines and jail time as possible punishments.

The National Police Chiefs' Council has held that AI-generated child pornography can be as harmful as the real thing.  A pedophile could gain the same mental and physical stimulation from these images, which in turn could lead to a real-world offense against a child.  Therefore, in the UK, these images are treated no differently than any other type of child pornography.

However, tracking down the creators and distributors of such images can be difficult.  In general, AI-created porn reaches users through a three-step process:

  • First, a creator generates the images using AI software. In most cases, this is done using keyword prompts, although other types of software under constant development may accept other forms of input.
  • Next, the images are promoted through sharing platforms. Many of these platforms have safeguards intended to block child pornography, but images still manage to slip through.  In other cases, "dark web" sites shared with users through secret communications feature the images without restriction.
  • Finally, end users with linked accounts view and/or download the images directly on their own devices.

The difficulty in tracking down and stopping the sharing of these images stems from the fact that the host sites are scattered around the world.  A country may not have jurisdiction over a host site; law enforcement must therefore often rely on catching the "end users," which can be difficult and time-consuming.

Ultimately, the market for child pornography reflects a deeper problem rooted in pedophilia.  Until universal safeguards are in place to protect children from this type of exploitation, it will be difficult, if not impossible, to stop it completely.

If you believe that you or a family member has been a victim of sexual abuse or exploitation, contact National Injury Attorneys toll-free at 1-800-736-5300.  We are here to confidentially help you pursue justice and hold accountable those responsible for your pain, suffering, and trauma.