Sharing intimate images of someone without their consent and with intent to cause distress has been illegal since 2015, and since 2021 it has also been illegal to threaten to share such images, thanks to Refuge’s Naked Threat campaign. As a result of sustained campaigning by Refuge and allies, the Online Safety Act made the sharing of AI-generated intimate images without consent illegal. Recent reports of sexually explicit ‘deepfake’ images of Taylor Swift circulating in the news have highlighted the reality that women and children are disproportionately targeted by generative AI. The Act also brought in further changes around sharing and threatening to share intimate images without consent.
• It is illegal to share an intimate image without consent. This means it is no longer necessary to prove the perpetrator intended to cause harm in order to prosecute. Perpetrators could go to prison for up to 6 months if found guilty, or up to 2 years if it is proven the perpetrator also intended to cause distress, alarm or humiliation, or shared the image to obtain sexual gratification.
• Survivors of intimate image abuse will be granted automatic lifelong anonymity and automatic eligibility for special measures during the trial, such as giving evidence via video link.
• Sending unsolicited sexual images, also known as ‘cyberflashing’, has been criminalised. However, the offence requires proof that the perpetrator intended to cause harm or to obtain sexual gratification, which can be difficult to establish.
The government will also bring forward a package of additional laws to tackle a range of abusive behaviour including the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.
These will cover so-called ‘downblousing’ – where photos are taken down a woman’s top without consent, allowing police and prosecutors to pursue such cases more effectively.
The amendment to the Online Safety Bill will broaden the scope of current intimate image offences, so that more perpetrators will face prosecution and potentially time in jail.
The Domestic Abuse Commissioner, Nicole Jacobs, said:
I welcome these moves by the government which aim to make victims and survivors safer online, on the streets and in their own homes.
I am pleased to see this commitment in the Online Safety Bill, and hope to see it continue its progression through Parliament at the earliest opportunity.
Around 1 in 14 adults in England and Wales have experienced a threat to share intimate images, with more than 28,000 reports of disclosing private sexual images without consent recorded by police between April 2015 and December 2021.
The package of reforms follows growing global concerns around the abuse of new technology, including the increased prevalence of deepfakes. These typically involve the use of editing software to make and share fake images or videos of a person without their consent, which are often pornographic in nature. A website that virtually strips women naked received 38 million hits in the first 8 months of 2021.
Earlier this year, the ‘ENOUGH’ campaign launched to tackle violence against women and girls. The campaign gives bystanders safe ways to intervene if they witness violence against women and girls, including sexual harassment on the street, unwanted touching, sharing intimate images of someone without their consent and coercive control in a relationship. Through the government’s Tackling Violence Against Women and Girls Strategy, the Home Office increased its funding to the Revenge Porn Helpline in 2021/22 to £120,000 to support victims of non-consensual intimate image sharing. Under the Tackling Domestic Abuse Plan, the Home Office increased this further to £150,000 in 2022/23. Since the Helpline was established in 2015, it has supported nearly 16,000 people and removed over 270,000 individual pieces of content.