Vodafone has been supporting the Internet Watch Foundation's vital online safeguarding and investigative work for 20 years.
The Internet Watch Foundation (IWF) is a not-for-profit charity, independent of government and law enforcement, whose mission is to remove child sexual abuse images from the internet.
This may seem like a depressingly impossible task in the era of encrypted digital communications, but in the 20 years that Vodafone has been supporting the charity's work, there have been some significant successes, says Michael Tunks, the IWF's senior policy and public affairs manager.
In 2021, the IWF removed 252,000 web pages containing child sexual abuse imagery, which “is millions of images”, he says.
In 1996, the year the IWF was founded, 18% of the world’s child sexual abuse imagery was hosted in the UK. But today that’s less than 1%, and has been at that level since 2003, he adds.
In 2014, the UK Government and Crown Prosecution Service gave the IWF permission not only to receive public reports but also to use its analysts’ skills and knowledge to proactively seek out this content online. “This was a gamechanger for us, it made a huge difference to our detection rate – it went up by 137% in the first year,” Mr Tunks says.
Also key to this success, he believes, has been a close partnership with telecoms companies such as Vodafone, internet service providers, tech companies, government and law enforcement.
Globally, however, there is still an epidemic of such imagery flooding the internet. Facebook owner Meta sends around 20 million reports of child sexual abuse imagery to the US National Center for Missing and Exploited Children every year, he says.
In the UK, the police are arresting around 750 people a year for offences relating to child sexual abuse imagery. But the National Crime Agency estimates that 500,000 to 850,000 people in the UK pose a sexual threat to children, online or offline.
Sadly, much of the imagery now being found is generated by children themselves, who have been groomed or coerced into performing sexual acts over webcams in their own homes. There has been a 374% increase in such imagery over the last two years, says Mr Tunks, exacerbated by the pandemic lockdowns. Self-generated imagery now accounts for nearly three-quarters of all the material the IWF is taking down, he says.
Why is this?
“Children have been spending more time online during the pandemic – as we all have – and smartphones and social media are ubiquitous now,” he says.