EXCLUSIVE - How technology can help in the fight against technology-facilitated child sexual exploitation
63% of child sex trafficking survivors were advertised online at some point during their trafficking situation. 25,000,000 child sexual abuse images are reviewed by the National Center for Missing & Exploited Children (NCMEC) annually. 57,335 URLs were confirmed by the Internet Watch Foundation to contain child sexual abuse imagery in 2016.
These are some of the statistics shared on the website of Thorn, an international anti-human trafficking organisation that works to address the sexual exploitation of children, using technology.
Technology has transformed individual lives and human society in many ways, mostly for good. But there is a dark side. Jim Pitkow, Chair of the Technical Task Force at Thorn, said in his presentation at INTERPOL World that technology did not create child sex abuse. But technology democratised it.
Thirty years ago, an act of child sex abuse involved just a few people. If the perpetrators tried to memorialise the act, it would be through pictures, and they would have to develop the pictures themselves. In order to disseminate that information to a broader community, they would have to physically ship it or distribute it.
This was before the web, which came in the early 90s. This was before peer-to-peer (P2P) networks, which appeared in the late 90s. This was before social media platforms, which emerged in the mid-2000s. And it was before the dark web came into existence, where people could enter into transactions and route traffic anonymously.
Today, there are entire online communities dedicated to the production and distribution of child pornography. They reach an audience of millions. Every time one of those images or videos is shared, the child is re-victimised.
But technology itself might offer solutions. That is what Thorn is exploring.
“Our main focus is to look at ways that we can deter predators. We do that by intercepting their behaviours online. We also look at ways to create a hostile environment for these activities to occur. We seek to reduce the time it takes to find victims,” Mr. Pitkow said.
Dark Web, Deep Web and Open Web
There are three layers of the web to consider here: the open web, the deep web and the dark web. The open web is what we use every day; it is what you are using while reading this article. It is the part of the Internet which can be indexed by search engines. The deep web is the part which cannot be indexed. It is estimated to be around 500 times the size of the visible web.
The Dark Web is a subset of the Deep Web that has been intentionally hidden and is inaccessible through standard web browsers. It can be accessed only with specific software, configurations, or authorisation, often using non-standard communications protocols and ports. We asked Mr. Pitkow during a chat after his presentation whether it takes any technical sophistication to access the Dark Web. Not anymore, he replied. Anyone can do it.
Often the worst of the worst child pornography content is generated on the Dark Web, where there are potentially hundreds of sites with hundreds of thousands of participants. This content then often gets pushed out through sharing on P2P networks on the deep web. The Child Rescue Coalition has identified more than 44 million unique IP addresses on the deep web involved in these activities.
Often, the content then reaches the Open Web. During the first 5 months of 2017, NCMEC received over 5 million reports regarding more than 10 million files. This is a near doubling of the pace from 2016, when there were over 8 million reports and 13 million files for the entire year.
While content on the Open Web is easy to track, investigating the Dark Web poses a number of challenges beyond the difficulty of identifying new sites or knowing who is who amid its inherent anonymity. Files get posted for just a few days, or sometimes hours. Someone produces new content, puts up a site, the content gets shared and the site goes down. There is no historical data, making it difficult to say whether a child is being newly victimised.
Technology tools and their uptake and impact
Solis – Peering behind the Dark Web
Thorn developed this tool with two goals. The first is to distinguish material from prior child sexual abuse from new cases of victimisation. The second is to figure out how offline material, or material on the Open Web, ties into the Dark Web, and to track how the content moves around. As mentioned above, a lot of material follows the path of Dark Web to Deep Web to Open Web.
While it is believed that there are hundreds of abuse websites on the Dark Web, Thorn decided to focus initially on the top 17 sites. Within a few months of operations, Thorn had identified over 5.5 million files, representing over half a million unique images, uploaded by over 300,000 users. “We realised that the problem was a lot larger than what we thought at the outset,” Mr. Pitkow said.
This tool is still in beta, but it has already been used internationally by 56 investigators across 13 countries, leading to the successful identification and rescue of over 60 children.
We asked about the technology involved in differentiating old and new content. Mr. Pitkow pointed to Project Vic in response. Project Vic, supported by Thorn through funding and advisory support, is a collaborative effort between the FBI, the Department of Homeland Security, the International Centre for Missing and Exploited Children, the Internet Crimes Against Children Task Force and law enforcement around the world to create a central repository of child sexual abuse image hashes.
Hashes serve as fingerprints of the photographs. An algorithm takes data on characteristics of the photograph and condenses it into a string, and it is infeasible to invert the hash to recover the image. It would be illegal for Thorn to view the photographs or to store them on its servers. Instead, the hashes, or fingerprints, are stored and shared.
“It uses photo fingerprinting technology, extracting the features of the photograph and putting that into a signature, so that you can compare two photographs and say if this is the same photograph. And it is robust against distortion, exchanges, modifications,” Mr. Pitkow explained.
Project Vic allows law enforcement to automatically check the photographs on, say, a seized hard drive against the database. This reduces the time taken to analyse seized content collections, helping law enforcement focus more rapidly on new content and possible victims.
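The fingerprinting systems used in practice (such as Microsoft's PhotoDNA) are proprietary, but the general idea of a distortion-tolerant image fingerprint can be sketched with a simple "difference hash": shrink the image to a tiny grayscale grid, then record whether each pixel is brighter than its neighbour. This is a minimal illustration, not the algorithm Project Vic uses; the grid values here stand in for an already downsampled image.

```python
def dhash(pixels):
    """Difference hash over a small grayscale grid (list of rows).

    Each bit records whether a pixel is darker than the pixel to its
    right. Because only relative brightness is kept, the hash survives
    uniform changes such as brightening or recompression artefacts.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append('1' if left < right else '0')
    return ''.join(bits)

def hamming(h1, h2):
    """Count differing bits; near-duplicates have a small distance."""
    return sum(a != b for a, b in zip(h1, h2))

# An 8x9 toy "image" and a uniformly brightened copy of it.
grid = [[(r * 9 + c) * 3 for c in range(9)] for r in range(8)]
brighter = [[v + 10 for v in row] for row in grid]

print(hamming(dhash(grid), dhash(brighter)))  # 0: same fingerprint
```

Matching against a database then becomes a matter of comparing fingerprints rather than images, so no abuse material ever needs to be stored or viewed.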
Spotlight – Finding the ones hidden in plain sight
Spotlight looks at escort sites, places where people go to exchange sexual services.
“The tool tries to find children that are used in those activities. That turns out to be a very difficult problem. It’s very difficult to determine somebody’s age from a photo or a posting. We sat down with law enforcement and we asked them how they go about solving this problem. They said that they would work through a bunch of ads and they would use their best guesses for who they thought was underage. They would respond to ads, go to a hotel room, see who showed up and then ask for ID,” Mr. Pitkow explained the challenge.
Thorn applied machine learning algorithms to solve this challenge of finding the advertisements on these websites which involve children. It was challenging because in machine learning you need a truth set to train the system. It took over a year to gather an initial truth set to begin testing and development.
But now Spotlight is able to use this learning to help identify which ads involve children. Spotlight is helping law enforcement identify on average 5 kids per day, and the tool has enabled a 60% reduction in the time it takes law enforcement to successfully investigate these cases.
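The role of the truth set can be illustrated with a deliberately tiny classifier. Spotlight's actual models and features are not public; the feature values and labels below are entirely hypothetical, and a nearest-centroid classifier stands in for whatever machine learning method is really used. The point is only that hand-labeled examples are what the system learns from.

```python
def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(truth_set):
    """truth_set: (features, label) pairs labeled by hand by reviewers.
    Returns one centroid per label: a nearest-centroid classifier."""
    by_label = {}
    for features, label in truth_set:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(vs) for label, vs in by_label.items()}

def classify(model, features):
    """Assign the label whose centroid is closest (Euclidean distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c)) ** 0.5
    return min(model, key=lambda label: dist(model[label]))

# Hypothetical two-number features per ad, e.g. a text-pattern score
# and an image-based score, each between 0 and 1.
truth_set = [
    ([0.9, 0.8], "flag"), ([0.8, 0.9], "flag"),
    ([0.1, 0.2], "ok"),   ([0.2, 0.1], "ok"),
]
model = train(truth_set)
print(classify(model, [0.85, 0.85]))  # flag
```

With no truth set there is nothing to average over, which is why assembling those first labeled examples took Thorn over a year.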
Thorn’s website states that the web-based tool is now used by over 4,000 officers in all 50 states of the United States. To date, over 6,000 sex trafficking victims and more than 2,000 traffickers have been identified using Spotlight.
Mr. Pitkow said that Spotlight also enables tracking the victims, “Kids are trafficked. In the US, we saw that kids would be in one city for a week, another city for another week. They just move around from state to state. But they tend to use the same images, tend to use the same phone numbers. And our tool is specifically geared towards monitoring that.”
Industry Hash Sharing platform – Detect once, remove everywhere
Earlier, online social media and image sharing platforms were screening independently for child sexual abuse images. If an image was found on one platform, that company would report it but it would continue to remain on others. Mr. Pitkow explained, “There’s no law which says you have to proactively scan for this. It only says that once you do find it, you have to report it. Doesn’t say you have to report it to your competitor.”
“We got these folks into a room and said can we all agree that if you find that image once, then it should just be removed from the internet, as broadly as we can? It’s in everyone's best interest to not have this content, anywhere on anybody’s platform. They said we agree,” he said describing the genesis of the industry hash sharing platform, a cloud-based hash sharing tool.
Thorn helped prototype it, in collaboration with industry partners, so that once one platform has images classified and sent to NCMEC, the hashes also get sent to this shared list. The hashes are then redistributed so that other platforms can check them against their existing corpus of photos, or against any new photos that get uploaded, without the content having to be reported for abuse on that platform.
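On the receiving platform, the check itself is straightforward: hash each upload and look it up in the redistributed list. The sketch below uses SHA-256 from Python's standard library; the list contents and function names are placeholders, and a real deployment would also use perceptual fingerprints to catch modified copies, since a cryptographic hash only matches byte-identical files.

```python
import hashlib

# Hypothetical shared hash list, as redistributed to participating
# platforms. Real lists contain hashes of confirmed abuse images;
# these entries are placeholders for illustration.
SHARED_HASHES = {
    hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
}

def should_block(file_bytes):
    """Check an upload against the shared list before it goes live.

    Only the hash is compared, so platforms never exchange the images
    themselves, and a match needs no fresh abuse report.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in SHARED_HASHES

print(should_block(b"known-bad-file-bytes"))    # True
print(should_block(b"harmless-holiday-photo"))  # False
```

Because every platform consumes the same list, a single classification event removes the file everywhere, which is the "detect once, remove everywhere" idea in the heading.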
In 2014, the Shared Hash approach was integrated into NCMEC’s cybertipline reporting system. The companies were already submitting their images to NCMEC, which acts as a clearing house. This initiative tied in reporting with enforcement.
Deterrence ads
Here, Thorn works with search engine partners: based on the distinctive terms that are used for seeking out child pornography and for child sex trafficking, advertisements are shown to deter those individuals. Over 2 million people have been reached with deterrence ads, and importantly, 60 thousand of those people sought help.
Child Finder Service
Child Finder is another tool being developed by Thorn. It is looking at facial recognition technology to answer the next question: if this is a new image and it shows a child, who is it? Can we match that image to something else that’s out there?
Mr. Pitkow said, “You might have this missing child, and somewhere across one of these other layers of the Internet, there is an image of the child being sexually abused. Before there was no way to connect those dots, because of the sheer volume of material. Especially if that child was abducted and taken somewhere else. Because the person investigating that crime wouldn’t know of the other stuff over here.”
When asked about the next steps, Mr. Pitkow said that Thorn will continue to expand its footprint globally and explore new technology-driven solutions to address the problem of child sex trafficking and abuse. “We are encouraged by our initial successes, by the increased engagement with law enforcement across the different sets of tools. We are excited to be expanding across different jurisdictions, across the world. We are excited about new technologies like facial recognition,” he said.