Announcement: The TCAP’s hashing and hash-sharing capability
We're delighted to announce that the TCAP is now hashing content from URLs submitted to the platform and will begin hash-sharing with the GIFCT Hash-Sharing Consortium.
The TCAP’s hashing and hash-sharing capability will support smaller tech companies in disrupting terrorist use of the internet
Tech Against Terrorism is delighted to announce that the TCAP is now hashing all URLs that contain terrorist content and will share these hashes with the GIFCT’s hash-sharing consortium, furthering our mission to support smaller tech companies in removing terrorist content.
In collecting data submitted to the TCAP, our developers have built hashing software that creates a unique digital fingerprint of each piece of terrorist content. The TCAP hashes verified terrorist content, helping the tech sector, particularly smaller tech companies, with automated decision-making when moderating terrorist content. Additionally, hashing and archiving all TCAP content means it can be used for research, further contributing to academic study and policy-making decisions. This addresses a gap highlighted throughout the counterterrorism industry: the lack of large datasets available for research.
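To illustrate the idea of a digital fingerprint: the TCAP's actual hashing software is not public, so the sketch below uses SHA-256 from Python's standard library as an illustrative stand-in for whatever algorithm is used. The `fingerprint` function and the example bytes are assumptions for demonstration only.

```python
import hashlib


def fingerprint(content: bytes) -> str:
    """Return a fixed-length hex digest acting as a unique digital
    fingerprint of a piece of content.

    SHA-256 is an illustrative stand-in; the TCAP's real hashing
    software may use a different (e.g. perceptual) algorithm.
    """
    return hashlib.sha256(content).hexdigest()


# The same content always yields the same fingerprint, while any
# change to the content yields a completely different one.
digest = fingerprint(b"example propaganda file bytes")
print(digest)  # 64-character hexadecimal string
```

Because the digest is deterministic, two platforms that hash the same file independently will arrive at the same fingerprint, which is what makes hash-sharing possible without exchanging the underlying content.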
Figure a) Conversion of files from their original format to a fixed-length numerical digest using a hashing function
Hash-sharing will significantly disrupt terrorist use of the internet, specifically terrorist groups’ propaganda dissemination. The GIFCT's hash-sharing database consists of hashes of verified images and videos created by terrorist groups. These hashes can then be shared with tech platforms, allowing them to pre-emptively block verified content without viewing user data, contributing to automated content moderation and helping ensure the same content cannot be reposted on other platforms.
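The matching step above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the GIFCT's or TCAP's actual implementation: the database is modelled as a set of SHA-256 digests, and an upload is checked by hashing it and looking the digest up, so no user data beyond the fingerprint is ever inspected.

```python
import hashlib

# Hypothetical shared database: a set of hex digests of verified
# terrorist content (here seeded with one example entry).
shared_hash_db = {hashlib.sha256(b"known propaganda video bytes").hexdigest()}


def check_upload(content: bytes) -> bool:
    """Return True if an upload matches the shared hash database.

    Only the content's fingerprint is compared, so the check can run
    at upload time without viewing any user data.
    """
    return hashlib.sha256(content).hexdigest() in shared_hash_db


print(check_upload(b"known propaganda video bytes"))  # True: matches the database
print(check_upload(b"holiday photos"))                # False: no match
```

In practice a platform would consult such a match as a signal for review or blocking rather than as an automatic removal decision, consistent with the final-decision point made later in this post.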
Whereas the GIFCT’s hash-sharing consortium is aimed at content moderation decisions by large platforms, the TCAP supports the removal of “zero-day” content that is newly created and published on smaller platforms. We will provide smaller platforms with a central and transparent place to host classifiers and hashing systems so that they do not have to install invasive and complex software on their own systems. We will also support hash integration with other technologies, such as infrastructure providers, and will ensure vetted users are able to verify hashed content.
One particular concern in the tech sector is that automated content moderation can result in the incorrect removal of legitimate and legal speech, including documentation of war crimes. This problem risks being exacerbated by “content cartels”: arrangements between tech platforms to remove content without adequate oversight, in which shared hashing databases wield huge power to determine what is permissible across the entire internet. Transparency and human verification are vital counterbalances to this danger.
The TCAP process is carefully designed to address these concerns. Prior to being submitted to the TCAP and hashed, content is verified by our expert open-source intelligence (OSINT) analysts to ensure that it is official content that falls within our inclusion policy. By basing content inclusion on the terrorist designation lists of democratic nation states and supranational institutions, we ground the TCAP in the rule of law. Each piece of content is verified according to our verification policy, which assesses both the source of the content and the content itself to ensure a high probability that the material was produced by a designated terrorist organisation in scope of the TCAP.
Additionally, the TCAP Transparency Report will reinforce transparency by disclosing statistics such as removal accuracy; to date, none of our alerts to tech companies has been disputed. However, if a tech company we alert does dispute a piece of content, we will review it and can remove it from the hash-sharing database if it does not meet our criteria for terrorist content. To ensure sufficient oversight, the TCAP will also have an academic advisory board to review our inclusion policy and samples of verified content. Finally, content in the hash-sharing database is not automatically removed but is flagged to tech platforms, which are still able to make the final content moderation decision based on their own Terms of Service.