Features | 07 Jul 2022

The watchdog fighting to protect children from online sexual abuse

Vodafone has been supporting the Internet Watch Foundation's vital online safeguarding and investigative work for 20 years.

The Internet Watch Foundation (IWF) is a not-for-profit charity, independent of government and law enforcement, whose mission is to remove child sexual abuse images from the internet.

This may seem like a depressingly impossible task in the era of encrypted digital communications, but in the 20 years that Vodafone has been supporting the charity’s work, there have been some significant successes, says Michael Tunks, the IWF’s senior policy and public affairs manager.

In 2021, the IWF removed 252,000 web pages containing child sexual abuse imagery, which “is millions of images”, he says.

In 1996, the year the IWF was founded, 18% of the world’s child sexual abuse imagery was hosted in the UK. Today that figure is less than 1%, where it has remained since 2003, he adds.

In 2014, the UK Government and Crown Prosecution Service gave the IWF permission not only to receive public reports but also to use its analysts’ skills and knowledge to proactively seek out this content online. “This was a gamechanger for us; it made a huge difference to our detection rate – it went up by 137% in the first year,” Mr Tunks says.
Also key to this success, he believes, has been close partnership with telecoms companies such as Vodafone, internet service providers, tech companies, government and law enforcement.

Growing problem

However, such imagery is still flooding the internet globally. Facebook’s owner, Meta, sends around 20 million reports of child sexual abuse imagery to the US National Center for Missing & Exploited Children every year, he says.

In the UK, the police are arresting around 750 people a year for offences relating to child sexual abuse imagery. But the National Crime Agency estimates that 500,000 to 850,000 people in the UK pose a sexual threat to children, online or offline.

Sadly, much of the imagery now being produced is ‘self-generated’: children are groomed or coerced into performing sexual acts over webcams in their own homes. There has been a 374% increase in such imagery over the last two years, says Mr Tunks, exacerbated by the pandemic lockdowns. Self-generated imagery now accounts for nearly three-quarters of all the material the IWF takes down, he says.

Why is this?

“Children have been spending more time online during the pandemic – as we all have – and smartphones and social media are ubiquitous now,” he says.

This has provided fertile ground for abusers to groom and trick children into taking and sharing images they shouldn’t, he says. A lot of the imagery is also shared peer-to-peer – in other words, teenagers video ‘sexting’ each other – and those images then leak, often after a relationship has broken down.

“A child might not necessarily know that what they are doing could be captured and used on a dedicated child sexual abuse imagery website,” says Mr Tunks.

Encryption conundrum

How much of an issue for the IWF is end-to-end encryption on messaging apps?

“It’s a huge challenge,” admits Mr Tunks. “At the moment there’s no way of identifying child sexual abuse imagery in an end-to-end encrypted environment, so companies shouldn’t be offering these services unless they have child protection measures in place.”

There should be a way of striking a balance between the right to privacy and freedom of speech, and the need for child protection measures, he argues. “At the very least” tech companies should be “developing and deploying tools that detect, flag and report known child sexual abuse imagery,” he says.

The IWF has been working closely with Vodafone and other providers to address such issues, including the encrypted ‘DNS over HTTPS’ (Domain Name System) protocol that Firefox maker Mozilla has already introduced in the US.

“This kind of encryption could override our URL blocking lists and bypass any parental controls parents have set up,” Mr Tunks warns.
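To see why DNS over HTTPS (DoH) can sidestep network-level filtering, it helps to look at the mechanics. A traditional DNS lookup goes in plain text to the ISP’s resolver, which is where block lists like the IWF’s can be applied; a DoH lookup instead travels inside an ordinary encrypted HTTPS request to a third-party resolver, so the ISP never sees which hostname is being resolved. The sketch below only builds such a request (it does not send it), and uses Cloudflare’s public DoH JSON endpoint purely as an illustrative example:

```python
# Minimal sketch of how a 'DNS over HTTPS' (DoH) lookup is formed.
# Cloudflare's public resolver is used here only as an example endpoint;
# the code constructs the request but does not send it.
from urllib.parse import urlencode

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"  # example public DoH resolver


def build_doh_query(hostname: str, record_type: str = "A") -> tuple[str, dict]:
    """Return the URL and headers for a DoH JSON lookup of `hostname`.

    Because the query travels inside TLS to the DoH endpoint rather than
    in plain text to the ISP's resolver, an ISP-level DNS filter cannot
    see the hostname being resolved -- which is how DoH can bypass URL
    block lists and parental controls applied at the network level.
    """
    url = f"{DOH_ENDPOINT}?{urlencode({'name': hostname, 'type': record_type})}"
    headers = {"accept": "application/dns-json"}  # request the JSON wire format
    return url, headers


url, headers = build_doh_query("example.com")
print(url)  # the whole request, including the hostname, is wrapped in HTTPS
```

In practice the browser performs this lookup itself, ignoring the operating system’s configured resolver entirely, which is why filtering that relies on the ISP’s DNS infrastructure no longer applies.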

The IWF’s lists of banned or blocked websites currently protect “about 80% to 90% of internet users in the UK” because all the big four mobile network operators use them, he says. And they clearly work. In the first pandemic lockdown these block lists prevented 8.8 million attempts to access known illegal child sexual abuse imagery, he reports.

If this new encrypted technical standard were to become more widespread, internet users could be left unprotected from illegal child abuse content, the IWF argues.

So far, there are no plans to introduce ‘DNS over HTTPS’ in the UK, but the IWF is lobbying hard to ensure child protection measures are built into any new privacy-focused internet standard.

‘Emotional resilience’

The IWF provides secure places where people around the world can anonymously report instances of child sexual abuse imagery. But most of the work is done by its specialist team of analysts proactively searching the web for this kind of highly disturbing content.

IWF analysts go through a rigorous vetting process, including enhanced DBS checks, and receive mandatory counselling every six weeks to help them cope with the harrowing images they have to view. They also undergo annual psychological assessments.
Analysts come from “a wide range of disciplines”, says Mr Tunks – including law enforcement, gaming and research – and must be able to process “lots of information very quickly”. Good observational skills are essential too, because close analysis of an image can help identify a location and potentially rescue a child, he says.

For example, one analyst rescued a girl after identifying the school crest on her jumper. Once they’ve gathered “actionable intelligence” they then inform law enforcement who can intervene to rescue the child and bring the offenders to justice.

But most importantly, analysts need “emotional resilience” to cope with the work.

“It’s really important that they have a good support network around them, both in and outside work,” he says.

IWF analysts are committed to what they do, he says, because “they really feel they’re doing something socially valuable. Getting these horrendous images removed from the internet really drives a lot of them.

“You can go home at the end of the day knowing you’ve done some good for children.”

Stay up-to-date with the latest news from Vodafone by following us on Twitter and signing up for News Centre website notifications.