Nudify apps and deep fakes – The law

Written 10th June 2024 by Ruth Peters

What is a “nudify app”?

The growth of ‘nudify’ apps and the increasing use of deep fakes across the internet mean that legislation needs to keep pace as technology develops. In this blog we consider the legality of ‘nudify’ apps and how deep fakes are dealt with from a legal perspective. A ‘nudify’ app is a specific type of photo editing application designed to simulate nudity in images, typically by altering or removing clothing in photos of people. These apps use various techniques, including artificial intelligence and machine learning, to create realistic-looking alterations.

Nudify apps allow users to upload photos and apply filters or editing tools to make the subjects appear nude or semi-nude. This might involve digitally removing clothing or altering the image to create the illusion of nudity. Many of these apps use advanced image processing algorithms and AI technologies to achieve realistic results. They might employ techniques like deep learning, which involves training neural networks on large datasets of images to accurately simulate the desired effects.

The use of nudify apps raises significant ethical and privacy concerns. Creating or sharing altered images without the consent of the person depicted can be considered a violation of privacy and can be harmful or exploitative.  Nudify apps have been criticized for their potential to be misused for harassment, revenge porn, and other malicious activities.

What is a deep fake?

A deep fake is synthetic media – typically video, audio or images – created using artificial intelligence and machine learning techniques, especially deep learning. These techniques allow for the realistic alteration or creation of content that appears authentic. Here are key points about deep fakes:

Deep fakes utilize deep neural networks, particularly generative adversarial networks (GANs), to create highly realistic fabrications of people saying or doing things they never did.

While they can be used for legitimate purposes such as entertainment, satire, or in the film industry, they are often associated with malicious uses that compromise personal privacy.

Detecting deep fakes is challenging because the technology continuously evolves, making the fabrications increasingly convincing. The rise of deep fakes raises significant ethical and legal concerns, including issues related to consent, defamation, and the potential to deceive and manipulate public opinion.

What is the law in relation to sharing sexually explicit deep fake content?

Essentially, it is currently illegal to share sexually explicit deep fake content, but not to create the content itself.

The current law

Sharing Intimate Photographs or Film – Section 66B (1) – (3) Sexual Offences Act 2003
The offences of sharing intimate photographs/film are committed when an offender [A] intentionally shares a photo or film which shows, or appears to show, another person [B] in an intimate state, where B does not consent and A does not reasonably believe that B consents. The further offences apply where the sharing is intended to cause B alarm, distress or humiliation, or is for the purpose of obtaining sexual gratification.

References to a photograph or film are defined within the act to include deep fakes as follows:

(a) an image, whether made or altered by computer graphics or in any other way, which appears to be a photograph or film,
(b) a copy of a photograph, film or image within paragraph (a), and
(c) data stored by any means which is capable of conversion into a photograph, film or image within paragraph (a).

Further legislation in relation to deep fakes

On the 16th April 2024 the Ministry of Justice announced that a new criminal offence was to be introduced so that individuals who create sexually explicit deep fakes could be prosecuted. Whilst current legislation makes the sharing of such material illegal, it is not currently illegal to create it. The new legislation was intended to criminalise the creation of deep fakes. It would also allow individuals who create deep fakes and then share the material to be charged with two separate offences and potentially face an increased sentence.

The legislation was to be created under an amendment to the Criminal Justice Bill and was due to progress through parliament. However, following the calling of the General Election, parliament was ‘dissolved’ on the 24th May 2024 and accordingly the legislation will not currently progress further. 

The legislation was criticised, however, for requiring the creator of the deep fake to have an intention of causing alarm, humiliation or distress to the victim, as opposed to it being sufficient that the victim did not consent to their images being used in such a way. Indeed, the amendments introduced by the Online Safety Act – namely section 66B of the Sexual Offences Act in relation to the sharing of intimate photographs – removed the requirement to prove the intent of the perpetrator.

The creation of deep fake images has skyrocketed in the past few years with advancements in technology. Many groups have called for better regulation of the companies that profit from such technology, for example those whose platforms enable the creation of this material.

In February 2024 a coalition of 44 specialist organisations and experts on Violence Against Women and Girls (VAWG) wrote an open letter to the Chief Executive of Ofcom expressing concerns with the regulator’s approach to tackling illegal online content. They expressed a lack of confidence in Ofcom’s interpretation of the Online Safety Act and called on the regulator to urgently change course.

The future for additional deep fake legislation?

It seems likely that any incoming government will support legislation in relation to nudify apps. A Labour Together policy paper in March 2024 suggested that, should the Labour Party win the next election, it may go further than the proposals under the Criminal Justice Bill, introducing a general prohibition on nudify apps as well as stricter rules for AI developers to ensure their technology is not used to make harmful deep fakes. It also proposed measures requiring web hosting companies to ensure they are not involved in the distribution or creation of harmful deep fakes.

Ofcom, as the communications regulator, is responsible for penalising those who do not take sufficient measures to protect their users. The Online Safety Act allows the regulator to take action against platforms and to issue financial penalties.

Deep fakes and under 18s

Whilst the creation of material using nudify apps in relation to adults is not in itself a criminal offence, the situation is different in respect of those aged under 18. Possessing, making and distributing indecent images of children is a criminal offence in the UK irrespective of whether it is a real image or a pseudo-image – a pseudo-image being an image created by AI, computer graphics or any other technology which otherwise appears to be a photograph.

The main legislation used to prosecute indecent images cases is the Protection of Children Act 1978. 
Section 1(1) (a) of the Protection of Children Act 1978 includes “to take, or permit to be taken [or to make], any indecent photograph [or pseudo-photograph] of a child”.   It seems clear that using AI to generate child sexual abuse imagery could fall within the definition of ‘making’.

However, naked pictures of children are not necessarily indecent in themselves. The lowest level of indecent images, known as Category C images, is defined as ‘images of erotic posing’. There may be cases where an image is not necessarily erotic or posed but could still be classed as indecent, for example, a naked picture of a child not engaged in sexual activity but with a focus on the child’s genitals.

How can Olliers help if you are accused of intimate photograph offences?

At the pre-charge investigation stage, we can communicate with the police on your behalf. We can prepare representations in relation to either mitigation if the offence is admitted, or setting out your case if the offence is denied. 

If you are charged with this offence, we can represent you at court, with one of our specialist solicitors presenting your case. We will advise you on the strength of the evidence and the sentencing guidelines in the context of your case. If you are entering a guilty plea, we will do our very best to ensure you receive the lowest sentence possible. If you are entering a not guilty plea, we will put forward your case to the court, ensuring you have the best chance possible of the desired outcome.

Need a solicitor for deep fake or intimate photograph allegations?

Contact Olliers specialist team to arrange advice and representation in relation to intimate photograph allegations by completing the form below, telephoning 0161 8341515 or by emailing info@olliers.com.

Ruth Peters

Business Development Director