CYFOR Blog

The latest industry news and insights

Deepfake: New legislation on sexually explicit deepfake images


The UK Government has announced a new law making it a criminal offence to create sexually explicit deepfake images of another person without their consent.

What Is a Deepfake?

Deepfakes are images or videos generated with artificial intelligence (AI), made using dedicated deepfake software that digitally manipulates content, in most cases an individual's face, body, or voice, to make them appear as someone else.

Deepfakes are hard to detect with the naked eye, but digital forensic experts can examine them to determine their authenticity.

Are Deepfakes Illegal?

As of 31st January, the UK's Online Safety Act made it illegal to share non-consensual intimate imagery, though the creation of such content remained legal.

A survey carried out by ESET, a global leader in cybersecurity, revealed that 50% of British women increasingly worry about becoming victims of deepfake pornography. In response to these growing concerns, the UK Government has announced new legislation making it illegal to both create and share any non-consensual deepfake images generated by AI tools and apps.

Those convicted now face prosecution and an unlimited fine, and if the material is shared publicly, a potential prison sentence. According to the Ministry of Justice, the law applies to anyone who creates such an image, regardless of whether they had any intent to share it.

This law marks a significant stride in addressing a form of abuse that particularly affects women and young girls. As stated by Laura Farris, Minister for Victims & Safeguarding:

“It is another example of ways in which certain people seek to degrade and dehumanise others – especially women. And it has the capacity to cause catastrophic consequences if the material is shared more widely. This government will not tolerate it … This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime.”

Celebrity Deepfakes

An investigation conducted by Channel 4 in March 2024 found that, over recent years, thousands of celebrities have become easy targets because their images are so readily available.

The likes of American singer-songwriter Taylor Swift and actress Jenna Ortega have fallen victim to the offence, with non-consensual deepfake images of them going viral across Facebook, Instagram and X (formerly Twitter). Undressed images of Jenna Ortega, who was underage at 16, received millions of views before Meta removed them.

The investigation also found that hundreds of British celebrities had been subjected to deepfake pornography, including former Love Island star Cally Jane Beech and Channel 4's own presenter, Cathy Newman. Speaking out about her experience earlier this year, Beech stated:

“What I endured went beyond embarrassment or inconvenience.”

Beech also said she views the strengthening of these laws as a significant step towards better protecting women.
