Catching paedophiles using image-processing technology

Image caption Technology can help analyse images but sometimes people are better

When the police catch up with paedophiles, they can be sure of one thing - they will find lots and lots of images and videos.

A signature trait they share is the collecting and curating of a large store of images of children being sexually and physically abused.

The only positive aspect of these horrific caches is that they help police officers build a case. The downside is that it can take time to process the images because the collections can be huge. Some abusers hoard millions of them.

"Image triage is very important and it's getting harder because storage is so cheap," said Mick Moran, co-ordinator of the crimes against children unit at Interpol.

Analysing all the images was vital, he said, not just to ensure that the person caught was properly charged.

"The material has to be sorted through with a view to finding images that have not been seen before and require immediate attention."

That novel material could reveal victims unknown to the police or lead to other abusers.

"60-90% of abuse takes place within the home or family circle," said Mr Moran. "Once you find the child, you find the abuser."

Find and filter

Technology is a big help when processing the mountains of images. At Interpol the sorting process is aided by a database made up of millions of images seized from abusers. That contains the raw images and each one has been processed to produce a mathematical summary known as a "hash".

Image caption The caches of images built up by paedophiles can be extensive

The first stage of triage typically involves putting freshly seized images through the same hashing process. Old hashes are then compared with new ones to reveal images not seen before.
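The triage step described above can be sketched with an ordinary cryptographic hash. The byte strings and the database of known hashes here are illustrative stand-ins, not Interpol's actual system:

```python
import hashlib

def image_hash(data: bytes) -> str:
    # A cryptographic digest acts as the "mathematical summary" of the file
    return hashlib.sha256(data).hexdigest()

# Stand-in for the database of hashes built from previously seized images
known_hashes = {image_hash(b"previously-seen image bytes")}

# Freshly seized material is hashed and checked against the database;
# only images whose hash is not already known are flagged for attention
seized = [b"previously-seen image bytes", b"never-seen image bytes"]
new_material = [item for item in seized
                if image_hash(item) not in known_hashes]
```

Anything left in `new_material` is the novel content that investigators would examine first.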

That is the theory. In practice it often proved to be a lot more complicated, said Fred Langford, director of technology at the UK's Internet Watch Foundation.

"As soon as someone changes the size or flips it on its side it changes the hash," said Mr Langford.

That was a bigger problem than it might otherwise be, he said, because images of child sexual abuse were so widely copied.

"We see the same images over and over again," said Mr Langford. Websites selling images of abuse often added a logo, he said, which typically changed the mathematical summary, or hash.

Many paedophiles who swapped and shared material knew to make tiny changes, such as altering the colour of a single pixel or altering a file extension, to stop the images being caught by scanning systems, said Alex De Joode, chief security officer at hosting firm Leaseweb.
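The fragility Mr Langford and Mr De Joode describe is easy to demonstrate: with a cryptographic hash, nudging a single byte (standing in here for one pixel) produces a completely different digest. A minimal illustration:

```python
import hashlib

original = bytes([10, 20, 30, 40])   # stand-in for an image's raw pixel data
tweaked  = bytes([10, 20, 30, 41])   # the same data with one value changed by 1

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tweaked).hexdigest()
# h1 and h2 bear no resemblance to each other, so a plain hash list
# fails to match the altered copy against the known image
```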

Like many other web firms, Leaseweb processes images being uploaded to ensure they are not known illegal images.

"We host a lot of user-generated content sites," said Mr De Joode. "These sites are being abused by people uploading child porn and it's something that our customers do not want."

Abusers needed places to post images, he said, because many of the other places they met online were text-only chat forums where only links, rather than images, could be posted.

Eyes right

Lists of hashes help the sorting process but their usefulness is limited by the changes regularly made to images.

An image-processing tool that can ignore those tiny changes and work out what other images it resembles has been developed by Microsoft researchers.

Instead of a hash, this creates what its creators call a "signature" for each image. Unlike a hash this signature does not change when an image is altered or manipulated.

"No matter how much it's changed, the underlying properties of the image's signature remain the same," said Stuart Aston, chief security officer at Microsoft UK.
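PhotoDNA's actual algorithm is proprietary, but the general idea of a change-tolerant signature can be sketched with a simple perceptual "average hash" (chosen here purely for illustration): the signature captures the image's overall brightness pattern, so tiny edits barely move it, and similarity is measured by counting differing bits rather than demanding an exact match.

```python
def average_hash(pixels):
    # pixels: 2-D grid of grayscale values, standing in for a downscaled image
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    # One bit per pixel: brighter or darker than the image's average
    return [1 if v > mean else 0 for v in flat]

def hamming(a, b):
    # Number of differing bits; a small distance means similar images
    return sum(x != y for x, y in zip(a, b))

image   = [[10, 200], [30, 220]]
altered = [[12, 200], [30, 220]]   # one pixel nudged slightly

# The signatures still match even though the raw bytes differ
distance = hamming(average_hash(image), average_hash(altered))
```

A genuinely different image would produce a large Hamming distance, so a threshold on the distance separates copies (however lightly edited) from unrelated pictures.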

Called PhotoDNA, the tool was developed to keep an eye on images uploaded to Microsoft's own services and to Facebook and now, with the help of Swedish firm NetClean, is being given to police forces to help them categorise images.

NetClean has a long history of helping police forces process images in a bid to spot illegal material and catch abusers.

Image caption Human experience can prove better than technology in some investigations

Mr Langford said PhotoDNA would help with some aspects of image classification, and could speed up image triage so investigators had more time to spend on other tasks.

But, he said, there were times when there was no substitute for people looking through images to see what they could find.

"The most effective tool is experience," said Mr Langford. "Analysts are skilled at spotting things that the software might not be able to recognise."

Mr Moran from Interpol agreed, saying it was often only because experienced investigators worked on images that lives were saved and children were rescued from abuse.

When they look at images, investigators look for clues that reveal where they were taken. Food packets, plug sockets and household objects can all help police home in on a location.

In one recent case, said Mr Moran, the key object was a doll held by a child. Interpol shared the images - seized in Boston - via its connections to police forces in Europe because it was clear they pointed to a group of abusers never seen before.

The image was seen by a Dutch police officer who knew that the doll was only available in Holland. It was shown on the Dutch version of Crimewatch, and anyone who knew the child was asked to come forward.

The shocked parents of the child contacted police, wondering where the image had come from. Investigation showed that it was taken at the child's creche. The child was one of 87 abused after being left for the day at one of several creches in Amsterdam.

The investigation helped police arrest 14 people who had either abused the children or taken photographs and shared them.

Technology and people can work together to catch abusers, said Mr Moran, but it would take more to stop the online trade altogether.

"As a result of the internet, latent paedophiles and people who could not care less are being exposed to child abuse material," he said. "Because of that, more children are being abused to produce material that can be put online to feed the desire for more material."

"We cannot police our way out of this," he said. "This is a social issue not a police issue."
