Women are being ‘failed’ by revenge porn laws because most perpetrators are avoiding justice, official data suggests.
Just 3 per cent of suspects were charged in 40,000 cases probed by police over the past five years.
The problem – in which intimate photos or videos are shared without consent, often by a former partner – is likely to grow, as experts warn that AI apps capable of generating lifelike fake sexual images will make it easier for abusers to humiliate victims.
Thousands of distressing cases were left unsolved or closed over the five-year period due to a lack of crucial evidence, the Home Office statistics revealed.
The data showed that 40,110 offences were reported to the police – roughly one every hour. Yet just 3.2 per cent of cases ended with a criminal charge.
Shadow Home Secretary Chris Philp said last night: ‘The statistics are absolutely staggering. We must do more to ensure perpetrators are held accountable and victims are properly supported so the system doesn’t fail women.
‘We were unapologetic in government about standing up for women and protecting victims of domestic abuse.
‘We toughened up sentences for rapists and stalkers, outlawed upskirting and revenge porn, and made violence against women and girls a national policing priority – but it is clear more needs to be done.’
In 2023, Love Island star Georgia Harrison took her ex-boyfriend to court for uploading sexual footage of her online.
Ms Harrison, 31, gave evidence against Stephen Bear, 35, at Chelmsford Crown Court, where he was jailed for 21 months for voyeurism and two counts of disclosing private sexual images without consent.
Experts fear the number of cases could soar as culprits use so-called ‘nudification’ apps, which can edit an ordinary photograph of a person to make it appear that they are naked.
Other powerful AI software can be used to insert a victim’s face into sexually explicit pictures or videos – known as deepfakes – such as the high-profile clips of pop star Taylor Swift that caused outrage last year.