Australia’s eSafety Commissioner has recently signed on to a pilot that is designed to reduce the availability of child sexual abuse material online.
According to a recent press release, Project Arachnid is an innovative tech platform based at the Canadian Centre for Child Protection.
It autonomously detects child sexual abuse material far faster than human analysts can.
However, human analysis is still required to classify images and confirm the quality of the data.
The commitment of eSafety to Project Arachnid means that its Cyber Report team will work collaboratively with investigators and analysts across the globe.
Doing so will scale up the capacity and impact of the project in identifying and removing child sexual abuse material from the internet.
The eSafety Commissioner explained that through this work, the Cyber Report team will make a significant impact in restricting the availability of child sexual abuse material to those who are seeking and distributing it.
Analysts from the Cyber Report team will help classify images detected by the Arachnid crawler, contributing to the international effort to build a comprehensive central database of child sexual abuse material ‘hashes’, or digital fingerprints.
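The idea of a hash as a 'digital fingerprint' can be illustrated with a cryptographic hash such as SHA-256: identical content always produces the identical digest, while any change to the underlying bytes produces a completely different one. This is a minimal sketch only; matching systems used in practice also rely on perceptual hashing techniques that tolerate minor alterations such as resizing or re-encoding, which a plain cryptographic hash does not.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest that identifies this exact byte content."""
    return hashlib.sha256(data).hexdigest()

# Placeholder byte strings standing in for media files.
original = b"example media bytes"
identical_copy = b"example media bytes"
altered = b"example media bytes!"  # one byte appended

# Identical content yields the identical fingerprint...
assert fingerprint(original) == fingerprint(identical_copy)
# ...while any change to the bytes yields a different one.
assert fingerprint(original) != fingerprint(altered)
```

Because only digests are shared, a central database of hashes lets organisations recognise known material without redistributing the material itself.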
Moreover, Cyber Report will gain access to the Arachnid Hash List of known child sexual abuse material, reducing the exposure of investigators to harmful content and improving the welfare of staff.
Additionally, through information sharing, this pilot will reduce the amount of harmful content investigators are exposed to, lessening the toll that this incredibly important work takes on them.
There is an important role to play in Australia, but ultimately, this is a global problem requiring a global solution.
The Agency is proud to be partnering with like-minded agencies around the world to fight this scourge.
What is Project Arachnid?
Project Arachnid is a technological tool designed to reduce the availability of child sexual abuse images online and help break the cycle of abuse experienced by survivors.
It discovers child sexual abuse material (CSAM) by crawling URLs across the clear web known to have previously hosted CSAM.
The platform determines that a particular URL contains CSAM by comparing the media displayed on the URL to a database of known signatures that have been assessed by analysts as CSAM.
If CSAM is detected, a notice is sent to the hosting provider requesting its removal.
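The detect-and-notify flow described above can be sketched as a simple membership test against a hash list. All names and hash values below are hypothetical placeholders, not Arachnid's actual API; the real platform's classification involves human analysts and far more sophisticated matching.

```python
import hashlib

# Hypothetical hash list of known signatures (placeholder values).
KNOWN_HASHES = {
    hashlib.sha256(b"known-material-1").hexdigest(),
    hashlib.sha256(b"known-material-2").hexdigest(),
}

def check_media(url: str, media_bytes: bytes, hash_list: set) -> str:
    """Compare fetched media against the hash list; return a removal
    notice message if it matches a known signature, else an empty string."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    if digest in hash_list:
        return "Removal notice for %s (matched hash %s...)" % (url, digest[:12])
    return ""

# A match triggers a notice; unknown content does not.
assert check_media("https://example.com/a", b"known-material-1", KNOWN_HASHES)
assert not check_media("https://example.com/b", b"benign content", KNOWN_HASHES)
```

Keeping the comparison to digests means the crawler never needs to store or forward the media itself once a hash has been computed.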
Estimates vary as to the quantity of child sexual abuse material available online. The United Nations (UN) states that approximately 750,000 people are accessing such material at any given moment.
Every month, Project Arachnid detects more than 500,000 unique images of suspected child sexual abuse material requiring analyst assessment.
To date, Project Arachnid has sent more than 1.6 million notices for removal of child sexual abuse material to online providers.