Empowering Law Enforcement with AI Tools to Stop Child Exploitation and Safeguard Investigators
JSI has partnered with the Internet Watch Foundation (IWF) to combat the spread of child sexual abuse material (CSAM) online. As a new member, we are integrating the IWF’s Hash List into our 4Sight platform. This collaboration will provide law enforcement with even better tools to detect, remove, and investigate CSAM, helping protect vulnerable individuals and hold more offenders accountable.
The Growing Problem of CSAM
The rise of the Internet in the late ’90s and early 2000s removed barriers to accessing and sharing CSAM, enabling its rapid proliferation worldwide. In 2023 alone, the IWF received over 392,000 reports suspected to contain child sexual abuse imagery – a 5% increase from 2022. Today, the true scale of CSAM is difficult to quantify: global estimates exceed 100 million files, and that figure accounts only for reported cases.
To address the overwhelming volume of CSAM, hash matching technology plays a crucial role. It works by assigning each image or video a unique digital fingerprint – a hash – and comparing it against a database of known CSAM hashes. When an online platform checks content against these databases, any matched files are flagged, removed, and blocked from further upload.
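The fingerprint-and-lookup idea can be sketched in a few lines. This is an illustrative toy, not the IWF's actual scheme: production hash lists also include perceptual hashes that survive re-encoding, whereas the SHA-256 digest used here matches only exact file copies. The `KNOWN_HASHES` set is a stand-in for a real hash database.

```python
import hashlib

# Toy stand-in for a known-hash database (real lists hold millions of
# entries and also use perceptual hashing). This entry is sha256(b"foo").
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest -- the file's 'digital fingerprint'."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Flag content whose fingerprint appears in the known-hash set."""
    return fingerprint(data) in KNOWN_HASHES

print(is_known(b"foo"))      # exact copy of a known file -> True
print(is_known(b"foo bar"))  # unrelated content -> False
```

Because a hash is a short fixed-length string, the lookup is a cheap set-membership test, which is what makes screening at platform scale feasible.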
Hash Matching in the 4Sight Platform
When investigative data is collected into 4Sight, every image or video is automatically compared against the IWF Hash List, which contains over 2 million monitored CSAM hashes. When a match is detected, the content is flagged for investigator review, ensuring that harmful content is identified quickly and accurately. This process not only aids ongoing investigations by providing critical context, but also helps curb the global dissemination of CSAM.
4Sight hashes data, compares it to the IWF List, and flags matches
“The IWF exists to hunt down images and videos of child sexual abuse online and eradicate them. But we cannot do this alone. Because of the vast nature of the internet, we need allies in the international tech industry to help us to disrupt and stop the spread of child sexual abuse material,” said Heidi Kempster, Deputy CEO of the IWF. “JSI’s membership will bolster our mission and ensure that global law enforcement organizations have access to our unique Hash List that IWF analysts have verified as containing child sexual abuse content and thus vital data for criminal investigations. We welcome JSI as a valuable partner in our efforts to protect children around the world.”
Vision Analytics: Expediting Investigations and Enhancing Detection with AI
The integration of the IWF’s Hash List is a key component of 4Sight’s Vision Analytics solutions. Powered by 4Sight’s AI engine, Iris, Vision Analytics will also equip investigators with pretrained detection models that automatically identify and flag images and videos containing nudity. Flagged content will be organized in a dedicated library, assisting in the detection and isolation of potentially unreported CSAM. Investigators can then easily share this content with specialized teams dedicated to combatting child abuse for further examination.
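The detect-and-route workflow described above can be sketched as a simple triage step. The classifier here is a hypothetical stub: 4Sight's Iris models and their APIs are not public, so `nudity_score` is an assumed function returning a confidence in [0, 1], and the threshold is illustrative.

```python
from typing import Callable

FLAG_THRESHOLD = 0.8  # assumed cutoff; a real system would tune this

def triage(items: dict[str, bytes],
           nudity_score: Callable[[bytes], float]) -> list[str]:
    """Return the IDs of items whose score crosses the threshold --
    i.e. the contents of the dedicated review library."""
    return [item_id for item_id, data in items.items()
            if nudity_score(data) >= FLAG_THRESHOLD]

# Usage with a dummy scorer over toy data: items starting with b"x"
# stand in for content the model would flag.
items = {"img1": b"safe", "img2": b"x-flagged"}
library = triage(items, lambda d: 0.95 if d.startswith(b"x") else 0.1)
print(library)  # ['img2']
```

Keeping flagged items in a separate library, rather than interleaved with ordinary evidence, is what lets them be handed off to specialized teams as a unit.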
The ability to quickly identify, isolate, and share identifiers of CSAM not only expedites investigations by providing actionable intelligence sooner, but also contributes significantly to the fight against its spread.
“At JSI, we’re committed to equipping law enforcement with the most advanced tools to combat the worst forms of online crime,” said Kevin Atkins, VP of Product at JSI. “Our partnership with the IWF and development of AI solutions marks a significant milestone in this effort, enabling us to support investigators not only in identifying and removing harmful content swiftly but also in protecting their mental well-being. Together, we can make a real difference in the fight against CSAM and build safer online communities worldwide.”
Protecting the Mental Well-Being of Investigators
The psychological toll that exposure to CSAM can take on investigators and analysts is significant. Studies indicate that frequent exposure to violent CSAM is associated with higher rates of post-traumatic stress symptoms (PTSS).
To address this, 4Sight’s Vision Analytics will shield users from harmful material by automatically displaying a lock screen overlay over flagged CSAM and nudity content. This feature limits direct exposure during analysis, allowing investigators and analysts to stay focused on their work without unnecessary harm to their mental well-being. Additionally, the content is subject to stricter permissions, ensuring that only users equipped to handle this type of material can access it.
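The two safeguards described above – a lock screen shown by default and an explicit permission gate – can be sketched as follows. All names here (the permission string, the `render` helper) are illustrative assumptions, not 4Sight's real API.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    permissions: set[str] = field(default_factory=set)

def render(content_id: str, flagged: bool, user: User) -> str:
    """Return what the UI would show this user for this item."""
    if flagged and "view_flagged_content" not in user.permissions:
        return "ACCESS DENIED"           # stricter permissions gate
    if flagged:
        return "LOCKED (click to view)"  # lock-screen overlay first
    return f"preview:{content_id}"

analyst = User("analyst", {"view_flagged_content"})
reviewer = User("reviewer")
print(render("img42", True, reviewer))   # ACCESS DENIED
print(render("img42", True, analyst))    # LOCKED (click to view)
print(render("img07", False, reviewer))  # preview:img07
```

The key design point is that even an authorized user never sees flagged material unprompted; viewing is always a deliberate act.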
CSAM and potential nudity automatically locked to reduce user exposure
However, it is important to note that blocking CSAM content from investigators and analysts is not a sufficient harm reduction strategy on its own. It must be coupled with comprehensive approaches, such as agency-level wellness programs and regular updates on case resolutions, which have been shown to improve mental health outcomes. Personal coping strategies, like connecting with colleagues and taking regular breaks, are also vital in mitigating harm.
For more information on 4Sight’s Vision Analytics solutions and our partnership with the IWF, please contact us here.