Artificial intelligence (AI) and indecent images

Written 26th February 2024 by Ruth Peters

What is Artificial Intelligence (AI)? 

AI, or Artificial Intelligence, refers to the simulation of human intelligence in machines programmed to think, learn, and perform tasks that would typically require a human. It involves the development of algorithms and models that enable computers and other systems to mimic cognitive functions such as problem-solving, decision-making, speech recognition, visual perception, language understanding, and learning from experience. 

What is text-to-image AI? 

Text-to-image AI refers to artificial intelligence systems that are capable of generating visual content, such as images or illustrations, based on textual input. These systems use advanced machine learning techniques, often leveraging deep learning models, to understand and interpret textual descriptions and then generate corresponding visual representations. 

The process typically involves training a neural network on a large dataset that contains pairs of text descriptions and corresponding images. The network learns the associations between specific textual features and the visual features of the images. Once trained, the model can take new textual input and generate images that align with the given descriptions. 

Like any technology, text-to-image AI has the potential for both positive and negative applications, and its ethical use depends on how it is implemented and deployed. 

Concerns about AI being used to generate indecent images 

In June 2023, the Internet Watch Foundation (IWF) began receiving its first reports from members of the public concerned that content they were seeing online might be indecent images of children created using Artificial Intelligence. Over five weeks its hotline received 29 reports of suspected AI-generated child sexual abuse imagery, and the IWF was able to confirm that seven webpages contained criminal imagery. 

Over the past twelve months, the text-to-image technology used to create AI-generated images has developed to astounding levels of accuracy. 

In December 2023 the IWF submitted written evidence to the Science, Innovation, and Technology Committee Inquiry: Governance of Artificial Intelligence (AI). 

The IWF stated: 

“A year ago, it would have been easy for our analysts to distinguish between an image that had been generated using Artificial Intelligence or was computer generated compared with an image which had not. By June, this was clearly becoming more difficult for our analysts, with often the only tells in the images we were receiving being the children in the images only having three toes, the fingers or hands not being quite the way you would expect, or beds blurring into the background in photos. In the last six months, many of these imperfections have been corrected, meaning it is now extremely difficult to tell the difference between a real child and one generated using AI, which presents clear and obvious challenges for law enforcement and victim identification.” 

How can AI be used to generate child sexual abuse imagery? 

There are three main types of image editing software utilised in the creation of AI-generated child sexual abuse imagery: 

  • Inpainting: a technique used in image processing to fill in missing or damaged parts of an image, allowing users to change elements of a picture. The goal of inpainting is to reconstruct the missing or damaged regions in a visually plausible and coherent manner, based on the surrounding information in the image. It is most commonly used to correct AI-generated physical deformities, such as an incorrect number of fingers or toes in an image. 
  • OpenPose: a computer vision library that specialises in human pose estimation, which involves detecting and locating key points on a person’s body, such as joints and body parts, within an image or video. OpenPose enables users to copy the composition of existing images and is most commonly used in the creation of AI-generated child sexual abuse images, and in pornography, to insert other people’s images into sexual positions.  
  • Roop: a tool that enables users to swap the faces of individuals in images. It has most often been used to create deepfakes and to add the faces of well-known individuals into scenes where they are depicted as sexually abusing children. 

Concerns about children using AI to generate child sexual abuse imagery 

There are also significant concerns about children themselves using AI to generate child sexual abuse imagery without fully understanding the consequences or indeed the fact they may be committing criminal offences. 

David Wright, Director at UKSIC and CEO at SWGfL, said children may be exploring the potential of AI image generators without fully appreciating the harm they may be causing, or the risks of the imagery being shared elsewhere online.  

He said:  

“We are now getting reports from schools of children using this technology to make, and attempt to make, indecent images of other children. 

“This technology has enormous potential for good, but the reports we are seeing should not come as a surprise. Young people are not always aware of the seriousness of what they are doing, yet these types of harmful behaviours should be anticipated when new technologies, like AI generators, become more accessible to the public.

“We clearly saw how prevalent sexual harassment and online sexual abuse was from the Ofsted review in 2021, and this was a time before Generative AI technologies. 

“Although the case numbers are currently small, we are in the foothills and need to see steps being taken now, before schools become overwhelmed and the problem grows. An increase in criminal content being made in schools is something we never want to see, and interventions must be made urgently to prevent this from spreading further. 

“We encourage schools to review their filtering and monitoring systems and reach out for support when dealing with incidents and safeguarding matters.”  

What is the law surrounding AI and indecent images? 

AI-generated images of child sexual abuse are illegal in the UK. Possessing, making and distributing indecent images of children are all offences in the UK, irrespective of whether the image is real or a pseudo-image (one created using AI or any other technology). 

Making indecent images and AI 

The main legislation used to prosecute indecent images cases is the Protection of Children Act 1978.  

Section 1(1)(a) of the Protection of Children Act 1978 makes it an offence “to take, or permit to be taken [or to make], any indecent photograph [or pseudo-photograph] of a child”.  

Following the case of R v Bowden [2000] 1 Cr. App. R. 438, ‘making’ an indecent image is defined as “to cause to exist, to produce by action, to bring about” an indecent image. The courts’ interpretation of ‘making’ is broad, and the following can all amount to making an indecent image: opening an email attachment, downloading an indecent image, storing an image, and accessing a website where an indecent image “pops up”. 

It seems clear that using AI to generate child sexual abuse imagery will fall within the definition of ‘making’. 

Prohibited images and AI 

The Coroners and Justice Act 2009 criminalises the possession of “a prohibited image of a child”. These are non-photographic – generally cartoons, drawings, animations or similar. 

A prohibited image is one which is ‘pornographic’ and ‘grossly offensive, disgusting or otherwise of an obscene character’. It must also satisfy further criteria detailed within the Act itself. In short, the image must either concentrate on the genitalia of a child, or show a sexual act (masturbation, penetration by a penis or any other object, oral, vaginal or anal sex, or sex with an animal) which either actively involves a child, or during which a child is shown to be present. 

Prohibited images, however, cannot be photographs or pseudo-photographs (for example, photographs which have been amended or photoshopped). They cannot show a real-life image of a person.  

They are, therefore, sketches, paintings, cartoons or any other unreal ‘depiction’ of a person. 

Paedophile manuals and AI 

Section 69 of the Serious Crime Act 2015 created the offence of being “in possession of any item that contains advice or guidance about abusing children sexually”. This is known as a paedophile manual.  

If a defendant has material containing advice or guidance about how to ‘make’ indecent photographs of children, they will likely be committing an offence under this section.  

It seems unclear at present whether the use of AI for the purposes of generating indecent images of children could constitute an offence under the Serious Crime Act 2015. 

Is more legislation or better guidance required for AI-generated child sexual abuse imagery? 

It has been suggested that the Crown Prosecution Service’s guidance on indecent photographs of children could be updated to make the laws surrounding AI-generated imagery clearer.  

What is clear is that the increased use of AI to generate indecent images will undoubtedly have a significant impact on police and other law enforcement resources. 

The IWF commented: 

“Additionally, the potential volume of AI-generated pseudo-images will likely have a major impact on law enforcement resources focused on identifying and safeguarding real world victims of this horrific offending.  

“If AI imagery of child sexual abuse becomes indistinguishable from real imagery, there is a danger that IWF analysts could waste precious time attempting to identify and help law enforcement protect children that do not exist.”  

Article written by Ruth Peters, Director at Olliers 

If you require advice in relation to an indecent images investigation, please contact our new enquiry team by email at info@olliers.com, by telephone on 020 3883 6790 (London) or 0161 834 1515 (Manchester), or by completing the form below. 
