Comparative Analysis of the TCAP Transparency Report Statistics on Content Collection and Removal


Executive Summary:

· This comparative analysis of Islamist versus far-right terrorist use of the internet, across a range of platform types and sizes, is based on the first Transparency Report of the Terrorist Content Analytics Platform (TCAP).

· During our reporting period, 1 December 2020 to 30 November 2021, there was a vast discrepancy between ideological groupings in the volume of content collected by the TCAP (18,787 Islamist submissions versus 170 far-right submissions).

· Several factors account for fewer URLs being collected in relation to far-right content, including online dissemination techniques that generate few URLs, lower official propaganda output, verification difficulties, and a lack of timely designation of far-right terrorist groups.

· There is also a significant difference in removal rates between Islamist and far-right terrorist content: Islamist content averages a 94% takedown rate, whilst far-right content averages 50%.

· Despite a relatively small sample size, our analysis concludes that this lower takedown rate for far-right content may be attributable to the difficulty moderators face in identifying it, legal and jurisdictional confusion, and platforms' hesitancy to remove content due to libertarian policies on freedom of speech.


The Terrorist Content Analytics Platform (TCAP) Transparency Report, covering the first year of the tool’s development, provides useful and unique data on terrorist use of the internet. The data enables a comparative analysis of Islamist versus far-right terrorist activity across a range of platform types and sizes.

It is important to clarify that this data only includes ‘official’ material from designated terrorist entities within the scope of the TCAP Inclusion Policy. During the reporting period between 1 December 2020 and 30 November 2021, the TCAP submitted 18,958 URLs containing terrorist content and sent 11,074 alerts to 65 tech companies. 94% of this content is now offline.

The first and most obvious observation is the vast discrepancy between ideological groupings in the volume of content collected by the TCAP. 18,787 URLs containing Islamist content were submitted, compared to 170 URLs containing far-right content. Meanwhile, 10,959 alerts containing Islamist content were sent to tech companies, versus just 115 alerts containing far-right content.

Figure: Breakdown of TCAP submissions and alerts by the two terrorist group types

This discrepancy in the volume of content collected by the TCAP requires closer analysis of far-right versus violent Islamist online dissemination techniques. First, violent Islamist terrorist groups in scope of the TCAP often disseminate each piece of propaganda content (e.g. a video) with large lists of URLs linking to multiple file-sharing platforms. This dissemination technique makes the content easy to locate and to verify as official, as it is often disseminated from ‘beacon channels’.¹ This is very different from far-right terrorist groups, who often post propaganda and other material in-app, without sharing it as URLs. This means, for example, that the TCAP might collect 100 URLs for one video produced by Islamic State and promoted by its supporter networks, but only one or two URLs for a single video produced by a far-right group such as National Socialist Order.

A second important factor is simply that the far-right terrorist groups included within the TCAP generally do not produce as large a volume of official propaganda as the violent Islamist groups in scope. Both al-Qaeda and Islamic State have sophisticated propaganda networks, with various media outlets through which official regional affiliates produce and disseminate large volumes of content on a regular basis. By contrast, the far-right groups in scope of the TCAP generally have less organisational capacity and fewer members, meaning they cannot produce official content at the same scale as violent Islamist groups.

An additional factor is the difficulty of verifying official content, particularly in relation to the violent far-right. As mentioned, Islamist content is often disseminated through stable official beacon channels and can be verified by its source, branding, and other aesthetic features associated with the relevant official media outlet. In contrast, a significant volume of violent far-right content is unbranded and supporter-generated, praising groups or individuals within scope through more subtle or coded messaging, and is therefore not currently in scope of the TCAP. There is also no centralised and stable source of far-right content from groups in scope: official channels are regularly removed and reappear, with affiliated official content migrating across a wide range of channels and platforms.

Finally, there are currently fewer far-right groups in scope of the TCAP than Islamist ones due to shortcomings in designation processes. Given that the TCAP relies heavily on the legal designation of terrorist entities by national governments and supranational institutions, certain violent far-right groups and their content are not currently in scope because they lack legal designation. Designation has also struggled to keep up with the fluid nature of violent far-right groups, meaning that groups in scope which have since disbanded or changed name (such as National Action) may no longer be producing new content.

Comparative analysis of content removal

TCAP data on takedown rates facilitates a comparative analysis of tech platform responses to our alerts by group ideology. There is a significant difference in removal rates between Islamist and far-right terrorist content: Islamist content averages a 94% takedown rate, whilst far-right content averages 50%.

Figure: Content offline per Islamist group

Figure: Content offline per far-right entity

It must first be noted that far-right content alerts represent a relatively small sample (115) compared to Islamist content alerts (11,074), and that no far-right content was alerted on certain platform types (file-sharing, link-shortening, paste sites, etc.). Despite this, on the platform types where the majority of far-right content was alerted (80% of it on archiving and video-sharing platforms), the percentage of that content now offline was significantly lower than for Islamist content.

An explanation for this may relate back to how identifiable the content is to tech platforms. Content produced by Islamist groups in scope tends to be branded and easily recognisable, whereas far-right content may only contain the symbols of designated groups, which, even when well-known, can be difficult for tech company moderators to detect, especially at smaller platforms that lack the resources to do so. This is partly why Tech Against Terrorism built the Knowledge Sharing Platform (KSP), which contains a symbol compendium to help identify terrorist content.

Furthermore, due to the historical prioritisation of violent Islamist terrorism in counterterrorism policies, the far-right has often been overlooked, resulting in confusion among tech platforms over the legality of this content. Tech companies may need to catch up on their knowledge of, and capacity for, detecting and moderating far-right content. If you are such a tech company, please get in touch with us at [email protected], and we will be able to support you through our mentorship programme.

Additionally, much of the far-right content alerted through the TCAP is hosted on alt-tech video-sharing platforms. These platforms have a higher threshold for content removal due to libertarian policies on freedom of speech in line with US law, which may explain the lower takedown rates of TCAP-alerted URLs.

There are also jurisdictional gaps: for instance, content supporting a UK-proscribed group may be illegal there but not in the US, where the tech company is based and where the law focuses on incitement to violence. Tech companies may therefore ban content only in particular jurisdictions, meaning it will still be accessible in other countries, or with the use of a VPN.

Figure: Alerts and takedown rates per platform type and ideology of the terrorist entity creating the content


¹ Beacons act as centrally located lighthouses that signpost viewers to where content may be found, often through outlinks pointing to content stores. Terrorists and violent extremists often use these beacon platforms and maintain official channels on them that serve as their central communications.