The Internet Watch Foundation, a charity that monitors and removes child abuse imagery online, has reported removing over 100,000 web pages containing images of abuse in 2018 - all of which had to be viewed by analysts.
Emma Hardy, director of communications at the Internet Watch Foundation (IWF), said the organisation had greatly reduced the amount of abusive imagery hosted in the UK.
“The UK really flies the flag for doing such a great job in this area, so much so that when we were set up in 1996, 18% of the world’s child sexual abuse imagery was hosted in the UK.
“There’s less than half a percent now because of the work we’ve been doing,” she told talkRADIO’s Mike Graham.
While the organisation - which works with law enforcement and internet companies to get the images removed - has made strides in taking down abuse content, Ms Hardy said there is still horrifying imagery out there, much of it hosted outside the UK.
'Some images contain babies'
“Last year we managed to take down more than 100,000 web pages,” she said.
“What we are concerned about is the amount of imagery that’s out there that isn’t removed as quickly as it could be.
“Some of this imagery is featuring children as young as newborns and babies. We go out there and look for this imagery and take reports from the public when they find it, and we trace where it’s hosted, the physical location around the world, and then contact our partners in that country.
“We will chase it up and chase them up until it’s removed. We’re incredibly effective at what we do, but it’s out there in the first place, and it shouldn’t be.”
In cases where the location of the child in the images can be identified, the IWF will pass this information to authorities to assist in rescuing the child.
“It’s very difficult to know exactly where these children are in the world, but where we can tell where they are, we pass that information to law enforcement and we have had some great success stories of children being rescued,” she explained.
'Resilient people' review and remove images
The gruelling job of looking at the imagery in order to identify where it’s being hosted is done by 13 analysts, who are carefully recruited and given psychological support.
“They viewed every single one of those 100,000 websites last year to check what they needed to get removed,” Ms Hardy said.
“We need resilient people who are compassionate and can handle that type of environment and that kind of content.
“We send them through quite an extensive recruitment process and look at lots of things to do with how they view and think about things. They have annual psychological assessments, we give them counselling every month, and before we employ them we show them the imagery and let them think about it.
“Once upon a time we’d expect about 50% of people to drop out, but now most people go on to take the job. We are a happy bunch of people and we really support our staff.”
Not all on dark web
It is a “common misconception,” Ms Hardy said, that a lot of abuse imagery is on the dark web. “The majority is on the open web that you and I can access easily,” she said. She explained how hashing technology - which creates a digital fingerprint of images when they are found - is used to ensure they are not reuploaded elsewhere.
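The idea behind the hashing approach can be illustrated with a minimal sketch: once an image has been identified, its fingerprint goes on a blocklist, and any later upload can be checked against that list without anyone having to view the image again. This is an illustrative example only - the sample data is made up, and real systems use perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 used here only catches exact byte-for-byte copies.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that uniquely identifies this exact file."""
    return hashlib.sha256(image_bytes).hexdigest()

# Blocklist of fingerprints from previously identified images
# (illustrative placeholder data, not real image content).
known_hashes = {fingerprint(b"previously-identified-image")}

def is_known(image_bytes: bytes) -> bool:
    """True if an uploaded file matches a fingerprint on the blocklist."""
    return fingerprint(image_bytes) in known_hashes

print(is_known(b"previously-identified-image"))  # exact copy is flagged
print(is_known(b"a-new-upload"))                 # unseen file is not
```

Because only the fingerprints are stored and compared, platforms can screen uploads automatically without hosting or redistributing the imagery itself.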
Anyone with concerns about inappropriate content can make an anonymous report through the IWF’s website at https://www.iwf.org.uk/