A world leader in reducing the availability of child sexual abuse material
What is Project Arachnid?
Project Arachnid is an innovative, victim-centric set of tools to combat the growing proliferation of child sexual abuse material (CSAM) on the internet. Launched in 2017, Project Arachnid unifies automated CSAM detection methods with a team of dedicated analysts around the world to quickly send removal notices to electronic service providers (ESPs).
Project Arachnid does not use or rely upon facial recognition technology. Instead, it uses hashing technology, which matches a particular image or video against a database of known CSAM. A hash match can be exact (one file is identical to another) or close (a resized copy of the same image, for example). Close matches are obtained using perceptual hashing technology, such as Microsoft’s PhotoDNA software.
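For illustration, the two matching modes described above can be sketched in a few lines of Python. This is a minimal conceptual sketch, not the production system: the average-hash below is a generic stand-in for perceptual hashing (PhotoDNA’s actual algorithm is proprietary), and images are assumed to be already decoded into small grayscale pixel grids.

```python
import hashlib

def exact_hash(data: bytes) -> str:
    """Exact match: byte-identical files produce identical SHA-256 digests."""
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash: set one bit per pixel, 1 if the pixel is at or
    above the image's mean brightness. Visually similar images (e.g. a
    resized or slightly brightened copy) yield similar bit strings.
    (Illustrative stand-in only; not PhotoDNA.)"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance indicates a close match."""
    return bin(a ^ b).count("1")

# Two tiny "images": the second is a slightly brightened copy of the first.
img = [[10, 200], [30, 220]]
brighter = [[12, 205], [33, 224]]
print(hamming_distance(average_hash(img), average_hash(brighter)))  # 0: close match
```

Exact hashing catches only identical files, which is why perceptual hashing is needed to detect altered copies of known material.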
Explore: How Does Project Arachnid Work?
Project Arachnid: Power of Arachnid
No Cost Tool for Industry: Shield by Project Arachnid
Project Arachnid offers an API to companies that want to make a serious effort to prevent this material from being posted and distributed on their systems.
Shield by Project Arachnid has been developed for use by ESPs to assist with the proactive detection of known CSAM and harmful/abusive images of children. There is no cost to using this tool, which allows content administrators and hosting providers to proactively compare incoming or existing media on their service against Project Arachnid’s list of digital fingerprints. Shield by Project Arachnid can be used as part of an ESP’s overall content moderation strategy to improve and accelerate the detection and removal of CSAM and harmful/abusive content.
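In practice, this kind of proactive comparison amounts to checking each incoming file against a set of known fingerprints. The sketch below is purely illustrative, assuming hypothetical names (`known_fingerprints`, `screen_upload`) that are not Shield’s actual API, and using exact SHA-256 matching as a stand-in for the full fingerprint list.

```python
import hashlib

# Hypothetical fingerprint set; in reality this would be populated from
# Project Arachnid's curated list of digital fingerprints, not computed
# locally like this.
known_fingerprints = {hashlib.sha256(b"known-flagged-media").hexdigest()}

def screen_upload(data: bytes) -> bool:
    """Return True if the incoming media matches a known fingerprint
    and should be blocked or routed to human review."""
    return hashlib.sha256(data).hexdigest() in known_fingerprints

print(screen_upload(b"known-flagged-media"))  # True: block / flag for review
print(screen_upload(b"unrelated-media"))      # False: passes this check
```

A set lookup like this runs in constant time per file, which is what makes screening every upload against a large fingerprint list feasible.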
Project Arachnid detects content at a rapid pace and offers a global solution for disrupting the distribution of CSAM and harmful/abusive material involving children. This system has been significantly enriched by collaborating with hotlines and child protection organizations around the world.
In 2017, C3P created the Arachnid Orb—a device that allows other international hotline and child protection organizations to work collaboratively within Project Arachnid. Since then, 12 hotlines and child protection organizations in 11 different countries have joined our global network by classifying suspect images/videos detected by Project Arachnid for the purpose of issuing removal notices to providers.
Global hotlines and child-protection organizations working in Project Arachnid to classify material
As of September 1, 2022:
- 153 billion+ images processed
- 48 million+ suspect media triggered for analyst review
- 14 million+ takedown notices issued
Explore: What Does Project Arachnid Send Notices On?
An increasing number of youth/survivors are reaching out for assistance, asking Project Arachnid to detect their images/videos and issue removal requests on their behalf. Many cite the ongoing anxiety, exhaustion, and trauma associated with trying to self-monitor the public availability of their harmful/abusive material as important considerations in reaching out for support.
We are here to help! If you would like our assistance in contacting a hosting provider to request the removal of your images/videos, please contact us.
Learn more about our additional survivor support services.
C3P is committed to innovative research that looks to better understand the scope of child sexual victimization (whether online or hands-on abuse), the systemic failures that allow it to occur, and where the gaps in child protection lie, in order to hold industry and government accountable, promote transparency, and assist policymakers in developing effective regulatory frameworks that protect children and support survivors.
Using data from Project Arachnid, C3P authored a first-of-its-kind report that offers a never-before-seen look into the availability of CSAM online and how electronic service providers respond to removal notifications from Project Arachnid.
This is an urgent call to action for governments, industry, and hotlines around the world. Current policies for the removal of CSAM have focused on determining and removing material deemed illegal under criminal law. In contrast, this framework is grounded in the best interests of the child and their right to dignity, privacy, and protection from harm.