17 November 2025

Regulating online terrorist content – Balancing public safety and fundamental rights

Online terrorist content is a threat to fundamental rights, rule of law and democracy. EU measures to tackle such content aim to prevent terrorism while upholding these values. FRA’s report looks at how online terrorist content is detected and removed under EU legislation. It highlights challenges in interpreting rules, risks of over-removal and potential impacts on freedom of expression. It finds that moderation practices by authorities and platforms can disproportionately affect certain groups, such as Muslims and Arabic speakers, while far-right content often receives less scrutiny. The findings, based on research and expert interviews with those addressing online terrorist content, offer ways to improve transparency in content moderation and to better balance public security and fundamental rights, contributing to wider debates on regulating online content responsibly.

The key novelty introduced by the regulation is removal orders, which national competent authorities can issue to HSPs, requiring them to remove terrorist content or to disable access to it in all Member States (Article 3(1)) as soon as possible and, in any event, within one hour of receipt of the removal order (Article 3(3)). If a competent authority has not previously issued a removal order to an HSP, that HSP should receive information on the applicable procedures and deadlines at least 12 hours before the first removal order is issued (Article 3(2)).

The rationale behind the one-hour limit is the need to counteract the ‘speed at which terrorist content is disseminated across online services’ (Recital 17). The regulation does not envisage HSPs reviewing the content or objecting to its removal during this one-hour period, unless manifest errors or technical issues prevent them from implementing the removal order (Article 3(8)).
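
To make these timing rules concrete, the following minimal sketch encodes the Article 3 deadlines described above. It is purely illustrative: the function names and the simplified logic are this example’s own assumptions, not code used by any competent authority or HSP.

```python
# Illustrative sketch of the Article 3 timing rules described above.
# All names and the simplified logic are assumptions made for this
# example; they do not represent any authority's or HSP's actual system.
from datetime import datetime, timedelta


def first_order_notice_ok(info_sent: datetime, order_issued: datetime) -> bool:
    """Article 3(2): an HSP that has never received a removal order should
    get information on procedures and deadlines at least 12 hours before
    the first removal order is issued."""
    return order_issued - info_sent >= timedelta(hours=12)


def removal_deadline(order_received: datetime) -> datetime:
    """Articles 3(1) and 3(3): remove the content or disable access to it
    in all Member States as soon as possible and, in any event, within one
    hour of receipt of the removal order."""
    return order_received + timedelta(hours=1)


def may_defer(manifest_error: bool, technical_impossibility: bool) -> bool:
    """Article 3(8): the HSP is not expected to review the substance of the
    content; only manifest errors or technical issues preventing execution
    justify not implementing the order within the one-hour window."""
    return manifest_error or technical_impossibility
```

Notably, nothing in this logic turns on the actual nature of the content, which is precisely the fundamental rights concern discussed below.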

Recital 40 also acknowledges the parallel existence of referrals. Referrals are not governed by the regulation, but authorities of some Member States (typically ‘internet referral units’ set up within some national law enforcement and counterterrorism authorities on the basis of national legislation) and Europol (the EU Internet Referral Unit, based on the explicit mandate to issue referrals under Article 4(1) of the Europol Regulation [46]
 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53, ELI: http://data.europa.eu/eli/reg/2016/794/oj.
) use them to alert HSPs to content that could be considered terrorist content, for the provider’s voluntary assessment of its compatibility with its own terms and conditions. The final decision on whether to remove content flagged by a referral therefore remains with the HSP. Referrals are described in the regulation as an effective means of increasing HSPs’ awareness of specific content available through their services and enabling them to take swift action. The regulation leaves it to competent authorities whether to use a removal order or a referral when addressing terrorist content online. The interplay between the use of referrals and removal orders is therefore an important factor in the application of the regulation and its impact on fundamental rights.

This chapter addresses the fundamental rights implications of the mandatory nature of removal orders and the one-hour time limit for their execution. It looks at the implications for the rights of content providers and other users, notably their freedom of expression, along with the impact on HSPs and their freedom to conduct a business. It then explores the interplay between the use of removal orders and referrals and its implications for fundamental rights.

When it comes to the impact of removal orders on content providers, interview findings reveal that respondents’ concerns relate mostly to freedom of expression and information (Article 11 of the Charter) and the right to an effective remedy and a fair trial (Article 47 of the Charter). Given the risk of over-removal and a potential chilling effect, other impacted rights can include freedom of thought, conscience and religion (Article 10 of the Charter), freedom of assembly and of association (Article 12 of the Charter), freedom of the arts and sciences (Article 13 of the Charter) and non-discrimination (Article 21 of the Charter). Furthermore, the requirements for swift compliance with removal orders set out by the regulation also impact HSPs’ freedom to conduct a business (Article 16 of the Charter), a concern expressed already in relation to the draft regulation in 2018 [47]
 See, for example, EDPS, ‘Formal comments of the EDPS on the proposal for a regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online’, 12 February 2019, p. 5; FRA, Proposal for a regulation on preventing the dissemination of terrorist content online and its fundamental rights implications – Opinion of the European Union Agency for Fundamental Rights, Publications Office of the European Union, Luxembourg, 2019, pp. 26–28.
.

Interviewed experts across professional groups acknowledge the risk that the regulation’s requirements concerning the execution of removal orders can lead to removals of legitimate content and adversely impact the freedom of expression and other rights of content providers and other users. This stems from two broad factors: the risk that a removal order is issued for content that is not terrorist in nature, coupled with the obligation for HSPs to speedily remove flagged content without an opportunity to review it first.

Prior to removal, the regulation leaves determining the terrorist nature of the content squarely in the hands of the competent authorities. It does not envisage the HSP addressed by the order reviewing the content in question, requiring it to implement the order except in the narrowly defined circumstances of Article 3(8), none of which relate to the actual nature of the content. HSPs can voice their objections – review the content and decide to contest the removal in court or before the authority conducting scrutiny of cross-border removal orders – only once they have complied with the removal order (see Chapter 4) [48]
 In France, the rules for implementing removal orders were one of the main concerns raised by civil-society organisations, including La Quadrature du Net, which referred the implementing national legislation to the Council of State (Conseil d’Etat) in November 2023. The application was pending at the time of writing this report.
.

If an HSP considers that the content does not contravene its own terms and conditions, it has the possibility to block it within the EU only. This is important for HSPs that also operate in other jurisdictions where the same content might be legal. Such a solution nevertheless still impacts the rights of content providers and users within the EU.

Experience shared by experts from HSPs shows different approaches by companies in this regard. One testifies that they systematically take down the content first and conduct their review afterwards, while another emphasises that they would simply remove the content within minutes. Another HSP expert, without practical experience of removal orders so far, likewise indicates that they are not expected to review the legality of the order and would tend to trust the assessment of the authorities, which is their default approach to government requests.

Experts from some large HSPs with experience of receiving multiple removal orders, on the other hand, point out that they attempt to assess the content first but that the one-hour period makes it difficult to do so. As an example of the practical challenges raised, one of these experts says that they frequently need to request clarification and understand the reasoning behind an order, especially when the content does not appear terrorist in nature, but are mostly forced to remove the content before being able to do so.

The one-hour removal requirement leaves very little time for review before enforcement. This can have unintended consequences for users’ rights, particularly freedom of expression and access to information, if a takedown request is issued in error. In rare cases, legitimate content may be removed unnecessarily.

HSP expert

Some HSP and civil-society/academia experts highlight that an HSP’s approach to scrutinising removal orders may depend on the company’s business model, the priority its own policies attach to combating terrorist content, its relationship with the authorities and the degree to which it is willing to come into conflict with them over the fundamental rights of its users.

As some of these experts note, awareness among HSPs also varies significantly, not only of the regulation in general but particularly of fundamental rights issues, where it can be low among smaller HSPs. This might hamper them from assessing the fundamental rights impact of the removal orders they receive, not only before but also after executing an order. An expert from a smaller HSP, which has already been subject to removal orders, highlights that the risk of fines might especially demotivate smaller HSPs from questioning the assessment of the authorities.

With smaller providers, this can be an issue. If […] you were going to get a fine because you didn’t act fast enough, then you will obviously actively start to enforce things more on your side – better safe than sorry.

HSP expert

In this context, some interviewees question the balance between the need to remove content urgently and the impact of the one-hour deadline on HSPs and, through them, on the rights of content providers whose legitimate content may be erroneously removed.

Some HSP and civil-society/academia experts argue that the importance the regulation attaches to the speed of removal as the main metric of compliance disregards the quality of the terrorist content moderation systems implemented by HSPs themselves. These experts recall that the regulation complements HSPs’ own moderation, through which they remove much more terrorist content, and much faster, than competent authorities can flag. Metrics should therefore take into account these efforts by HSPs and the general quality of their cooperation with authorities, as well as the quality of the removal orders issued by competent authorities, which varies and affects the speed at which HSPs can process them.

Some interviewees argue that different types of content might justify different responses. Recalling the Christchurch terrorist attack, interviewees across all professional groups acknowledge the need for speedy takedowns to avoid viral sharing of footage of attacks or terrorist manifestos. Some of them question, however, whether the same urgency applies to all types of content falling within the scope of the regulation, noting that a similar obligation to remove terrorist content within one hour has already been invalidated by national courts in the past [49]
 In a decision of 18 June 2020, the French Constitutional Council (Conseil Constitutionnel) invalidated certain provisions of the law aimed at fighting online hate content, known as the Avia Law, stating that they infringed on freedom of speech and communication and are not necessary, appropriate and proportionate to the aim pursued. This included a provision obliging HSPs to remove illegal terrorist content and child sexual abuse material within one hour after the receipt of a notification by an administrative authority.
.

Some civil-society/academia interviewees say that the one-hour timeframe appears particularly stringent when compared with the DSA, which covers illegal online content more broadly, takes into account the size of HSPs, establishes an approach based on identifying and countering systemic risks, and does not impose strict deadlines.

Experts from civil society / academia also reflect that while the one-hour deadline begins to run after a competent authority flags the content to the HSP (and not from the moment of detection), competent authorities do not always issue removal orders immediately but might instead gather information and then flag multiple pieces of content at once. As a result, HSPs might receive a batch of removal orders and have only one hour to handle them all. This can put a strain on the company’s resources but also raises questions over whether the content indeed requires urgent removal.

Interviewees from competent authorities indicate that while HSPs comply with most removal orders (nearly 90 % result in the removal of content) and have mostly managed to observe the one-hour rule, it appears to present a challenge for some of them. Some consider acceptable a period of up to 24 hours – which, according to some experts, matches the time within which they would expect HSPs to act upon a referral – while others have experienced delays ranging from 15 minutes to a couple of days in a few instances. This variation and degree of leniency likewise indicate that authorities do not consider all content flagged by removal orders to warrant the urgency implied by the one-hour rule.

As discussed in Chapter 1, many interviewees consider that the definition of terrorist content is not sufficiently clear and foreseeable, which increases the likelihood of removal orders being issued in error.

This risk might be higher for certain types of content, for example based on language or subject matter requiring particular expertise. While most experts from competent authorities state that they have the necessary capacity to deal with the languages in which they typically encounter potential terrorist content, many acknowledge that content in particular languages or dialects is difficult to assess, both as text and as audio (like the current trend of using music such as nasheeds [50]
 A nasheed is a form of Islamic vocal music, frequently exploited by terrorist organisations to spread jihadist propaganda. See Europol, European Union Terrorism Situation and Trend Report – 2024, Publications Office of the European Union, Luxembourg, 2024, p. 28.
to disseminate propaganda). They point to the limited usefulness of translation tools and highlight the need to work with specialised translators.

Arabic is given as a prime example of this challenge. On the one hand, the large volume of terrorist content distributed in Arabic makes it highly relevant for authorities detecting and assessing potential terrorist content. At the same time, experts from competent authorities and civil society / academia emphasise that Arabic is highly context-dependent, and words and phrases taken out of context can easily be misinterpreted or have different meanings in its various dialects. This can impact the accuracy with which authorities assess content.

Arabic is a difficult language to translate because one word can have a lot of meanings.

Competent authority expert

Interviewees across professional groups also highlight particular challenges when it comes to assessing content related to current events or sensitive political issues. They emphasise the importance of context, saying that a particular piece of content can never be assessed in isolation, something that the regulation expressly recognises in Recital 11. Experts from competent authorities say that when they receive reports of alleged terrorist content from the public and other flaggers, these often relate to such complex topics and need to be very carefully scrutinised as they carry a high risk of disproportionately interfering with freedom of expression and freedom of thought, conscience and religion. This is also why human assessment based on experience is indispensable for their work and cannot be replaced by automated tools, most experts from competent authorities explain.

Recalling that the majority of authorities competent for issuing removal orders are law enforcement or intelligence agencies (see Section 1.2), some experts from civil society / academia express the view that such authorities might naturally be more inclined towards security-led approaches driven by their counterterrorism experience, without necessarily being equipped to fully assess the impact that ordering an expedited removal of content may have on freedom of expression and other rights. In this context, some mention that removal orders can amplify what they describe as inherent risks in counterterrorism: some authorities could use removal orders in line with national priorities – with political motivations influencing what is labelled as terrorism – to target content considered undesirable and introduce state censorship of certain topics. Some interviewees warn about the risk of over-policing certain groups, with reference to existing research illustrating how certain communities can be particularly affected by the use of restrictive measures, especially after events such as terrorist attacks [51]
 See, for example, Equality and Human Rights Commission, Choudhury, T. and Fenwick, H., ‘The impact of counter-terrorism measures on Muslim communities’, Equality and Human Rights Commission Research Report series, 2011; Amnesty International, A human rights guide for researching racial and religious discrimination in counter-terrorism in Europe, 2021; European Network Against Racism and Choudhury, T., Suspicion, Discrimination and Surveillance: The impact of counter-terrorism law and policy on radicalised groups at risk of racism in Europe, 2021.
.

If you keep noticing, as a member of a community, that certain expressions of solidarity or because you write in Arabic, are subject to wider restrictions and the community has been reporting those […] it is a very natural reaction you probably step back and restrict your participation in public life.

Civil-society/academia expert

Beyond the direct impact on the rights of users as providers of removed content, some civil-society/academia interviewees express the view that the overuse of removal orders could lead people to abstain from publishing for fear of becoming persons of interest for counterterrorism authorities or having their profiles and channels of communication blocked by HSPs (see also Section 3.1.3). Such a chilling effect could affect not only providers of content that has been taken down but also other users who see some types of discourse being censored, these civil-society/academia experts say. The risk of a chilling effect in the context of counterterrorism measures is recognised and addressed in various reports and other documents of international bodies [52]
 See, for example, UN, Human Rights Council, Report of the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, Martin Scheinin, A/HRC/13/37, 28 December 2009; or Council of Europe, Limiting the Use of Criminal Law to Restrict Freedom of Expression: A Guide to Council of Europe standards, September 2025.
.

Similar to Directive (EU) 2017/541, the definition of online terrorist content in the regulation does not distinguish between different types of terrorism, such as – to use the classification applied by Europol – jihadist terrorism, right-wing terrorism, left-wing and anarchist terrorism, ethno-nationalist and separatist terrorism, and other forms [53]
 Europol, European Union Terrorism Situation and Trend Report – 2024, Publications Office of the European Union, Luxembourg, 2024, p. 4.
. At the same time, as indicated in past FRA research, Directive (EU) 2017/541 and its transposition by many Member States have focused predominantly on jihadism and on the phenomenon of foreign terrorist fighters [54]
 FRA, Directive (EU) 2017/541 on combating terrorism – Impact on fundamental rights and freedoms, Publications Office of the European Union, Luxembourg, 2021, pp. 9, 63–64, 86–87.
. As the findings of this research show, this focus appears to be largely carried over and possibly amplified in the application of the regulation.

The vast majority of content [flagged by authorities to HSPs] is focused on Islamist extremist terrorism […] a lot of law enforcement across Europe is primarily focused on Islamist extremist terrorism.

Civil-society/academia expert

According to data provided by Europol, the vast majority (circa 84 %) of all removal orders issued via PERCI as of March 2025 targeted jihadist content. In comparison, only about 13 % targeted right-wing extremist content [55]
 Information provided by Europol at FRA’s request on 18 June 2025, covering the period from the start of the operation of PERCI on 3 July 2023 until 31 March 2025.
. Interviews confirm this. While some experts from competent authorities state that they issue removal orders both on jihadist and right-wing (and potentially other) terrorist content, about half of those who have experience in issuing removal orders say they target predominantly or exclusively jihadist content.

Interviewees offer several explanations for this. First, authorities deciding which content should be subject to a removal order are largely guided by international or national lists of dangerous organisations and individuals, such as the EU sanctions list [56]
 Council of the European Union and the European Council, ‘Sanctions against terrorism’, Council of the European Union and the European Council website, 29 July 2025.
, the UN list of designated terrorist groups [57]
 UN, ‘United Nations Security Council consolidated list’, UN Security Council website.
and the US list of foreign terrorist organisations [58]
 US Department of State, ‘Foreign terrorist organizations’, US Department of State website.
. While the definition of a ‘terrorist group’ applied by the regulation is not limited to such ‘listed’ organisations and Recital 11 of the regulation states that a link to the EU list should be an ‘important’ (i.e. not necessarily decisive or definitive) factor, experts from competent authorities testify that content produced by or promoting such organisations – for example Daesh, al-Qaeda or Hamas – can be automatically considered to meet the definition of terrorist content. Experts from civil society / academia generally consider that the focus on listed organisations helps reduce the risk of affecting legitimate forms of expression. However, they highlight that these lists typically heavily focus on jihadist entities, while very few extreme right-wing or other terrorist entities have been designated.

Second, experts from competent authorities indicate that the definition of online terrorist content is better suited to jihadist content and the modus operandi of jihadist groups. Some state that the regulation captures jihadist content more clearly than other types, while others express the view that the regulation is intended to apply exclusively to jihadist content and that content related to right-wing, left-wing and other types of terrorism falls outside its scope. Some civil-society/academia experts tend to agree that the definition is easier to apply to jihadist content and less suitable for capturing unaffiliated or unbranded content, while others speak of a certain lack of appetite, or at least de-prioritisation, among some authorities when it comes to right-wing or left-wing content, although such content clearly falls within the scope of the regulation.

Third, more than half of the interviewed experts from competent authorities explicitly state that non-jihadist content is typically more difficult to assess. Right-wing content is often less clear-cut and more borderline, relying on memes and jargon, with right-wing extremists seemingly more aware of how to navigate the line between freedom of expression and illegal content. Based on their own experience of moderating content, an HSP expert likewise considers that terrorist content other than jihadist content is more challenging both to detect and to assess correctly. Some competent authority experts mention additional challenges with certain forms of right-wing content, such as antisemitic hate speech, blurred lines between extremism and terrorism, and the fact that in certain jurisdictions right-wing content might be easier to deal with under legislation against discrimination and extremism rather than terrorism. Some interviewees across professional groups also mention the dynamic environment of right-wing terrorism, with less clear profiles that mix different ideologies from militant accelerationism and occultism to incels [59]
 Incel (involuntary celibate) refers to a mostly online subculture that has been associated with cases of misogynist attacks. See, for example, European Commission and Radicalisation Awareness Network, Incels: A first scan of the phenomenon (in the EU) and its relevance and challenges for P/CVE, Publications Office of the European Union, Luxembourg, 2021.
(‘pick and choose’ or ‘salad bar’ approach) [60]
 See, for example, in this regard, The International Centre for the Study of Radicalisation, ‘Academic and Practical Research Working Group white paper: Extremism research horizons’, GIFCT, 2021, p. 2.
, along with the phenomenon of lone wolves and the frequent emergence of new groups.

I think [right-wing extremists] are well aware of the limits of the law and what they can do online. And it’s all a joke and it’s all memes […] so, yes, for the extreme right it’s very difficult to prove a terrorist character.

Competent authority expert

The majority of interviewees from civil society / academia flag the overwhelming use of removal orders for jihadist content as posing certain issues. While recognising the threat from jihadist terrorism in the EU (see textbox ‘Threat posed by jihadist and right-wing terrorist content’), some point to the wide proliferation of right-wing content online and the role that such content has played in triggering terrorist attacks. This raises the question of whether the de facto focus on jihadism and the limited ability to capture other content are compatible with the aim of effectively addressing the dissemination of diverse terrorist content [61]
 This has been recognised by counterterrorism experts, with some calling for designating more far-right terrorist groups to accurately reflect and respond to the danger stemming from these groups. See, for example, Tech Against Terrorism, Who Designates Terrorism? – The need for legal clarity to moderate terrorist content online, Tech Against Terrorism, 2023, p. 5.
.

On the other hand, the predominant focus on jihadism in the application of the regulation may result in a disproportionate impact on particular groups, notably Muslims and Arabic speakers. In this context, multiple interviewees point to the impact of the situation in Israel and Gaza following the attacks of 7 October 2023, which led to an increase in the volume of removal orders, predominantly targeting pro-Palestinian content. As reported in FRA’s past research on EU counterterrorism legislation [62]
 FRA, Directive (EU) 2017/541 on combating terrorism – Impact on fundamental rights and freedoms, Publications Office of the European Union, Luxembourg, 2021, pp. 8–9.
, such an overfocus on one type of terrorism might entail the policing of certain content based on its association with a particular religion or language rather than on an actual link with terrorism. This raises questions of compatibility with the principle of non-discrimination, freedom of expression and freedom of thought, conscience and religion.

Some experts from competent authorities acknowledge this risk and emphasise that potential jihadist content likewise requires careful assessment. Some offer examples of cases when establishing the terrorist nature of a piece of content required assessing its theological and historical context, interpreting references to the Qur’an and consulting experts on Islam. Posts related to current events are often highly contextual, one competent authority expert says, and authorities must be careful to correctly distinguish between expressions of sympathy with victims of armed conflicts or political views and content that amounts to expressing support for terrorism.

Compliance with removal orders within the one-hour limit envisaged by the regulation can involve substantial investment and changes, especially in the case of smaller HSPs, potentially affecting their business model and, in terms of rights, their freedom to conduct a business. Concerns that the regulation puts a disproportionate administrative burden on smaller HSPs that might be unable to meet its requirements were also raised during parliamentary discussions in some Member States [63]
 See, for example, the lower house of the German parliament (Deutscher Bundestag), Stenografischer Bericht 44. Sitzung, Plenarprotokoll 20/44, 23 June 2022, pp. 4598–4602 (Annex 9); Ministry of the Interior of Bulgaria, Ex-ante impact assessment of the Draft Law amending and supplementing the Ministry of the Interior Act, 15 January 2025.
. Challenges experienced by small and micro-sized HSPs when implementing the regulation and complying with the one-hour rule have also been reported in recent research [64]
 See Tech Against Terrorism Europe, ‘Report: The challenges that small and micro HSPs face in implementing the TCO Regulation’, Tech Against Terrorism Europe website, 8 October 2024.
.

Experts from small-sized HSPs and civil-society/academia experts with experience of capacity-building work with HSPs describe the one-hour rule as a practical challenge that only large HSPs – those able to ensure 24/7, year-round availability of their staff – are in a position to meet. These experts report concerns shared by small and medium-sized HSPs about their ability to maintain their businesses, especially if the volume of removal orders increases. Interviewees also pointed to very practical factors that need to be taken into account, such as differences in working hours when operating in a different time zone, and an HSP expert questioned the application of the same approach to companies regardless of their size.

If you have a certain size of user base, e.g. 20 million, it should be tackled in a different way than […] a startup which has one or two people. […] Everything under 24 hours is very unlikely to be handled by startups.

HSP expert

According to some competent authority experts, responsiveness is not always determined by an HSP’s size. One of these experts highlights that many small-sized HSPs have been very serious in their efforts to comply and to familiarise themselves with the new requirements, whereas even large platforms can have disproportionately small or unprepared teams.

Findings nevertheless show that understanding the requirements of the regulation and being prepared for the possibility of receiving removal orders are particularly important for smaller HSPs, and generally for all those with limited experience of dealing with terrorist content. In this regard, experts from competent authorities state that they follow the procedure required by Article 3(2), providing HSPs that have not received a removal order before with information at least 12 hours in advance. Only one mentions not having done so in the past, due to the urgency of the removals ordered.

When it comes to more systematic awareness raising (see textbox ‘Promising practice: Supporting HSPs through terrorist content online capacity-building projects’), however, small-sized HSPs often show limited interest, and some appear to actively avoid involvement in such programmes, civil-society/academia experts involved in capacity-building efforts say. This may occur for a variety of reasons: for example, because HSPs believe they do not fall within the scope of the regulation or do not consider themselves affected by terrorist content, but possibly also because they fear that participation in these trainings could be perceived by their users or the authorities as an admission of exposure to terrorist content, these experts say.

Finally, the impact of the regulation on HSPs needs to be seen in the broader context of EU regulatory efforts (and, for some HSPs, regulations emerging outside the EU). In this respect, a number of interviewees emphasise the interplay with the DSA. Some civil-society/academia experts express concerns over certain incoherence between the two frameworks, highlighting the more nuanced requirements and stronger safeguards present in the DSA (see textbox ‘Obligations under the regulation and the DSA’). The prevailing view among civil-society/academia experts, confirmed by some HSP experts, however, is that the DSA has by far eclipsed the regulation in terms of HSP compliance focus, due to its broader scope and more extensive requirements. In this context, an interviewee notes that this focus on the DSA can negatively impact the awareness of the regulation among smaller HSPs.

From a company perspective, there is a lot of legislation fatigue. Companies try to keep up to speed with each of these [pieces of EU legislation]. Even the TCO [regulation], which is a relatively simple piece of legislation, has so many implications, and that is nothing compared to the DSA.

Civil-society/academia expert

Prior to the regulation, referrals served as the main tool used by the internet referral units of Member States and Europol to tackle suspected terrorist content. Available statistics show that the introduction of removal orders has not changed this, as referrals continue to significantly outnumber them [65]
 In 2023, Member States transmitted 329 removal orders and 35 164 referrals to HSPs through PERCI – over 100 referrals for every removal order. While the use of removal orders by Member States has increased in the meantime, referrals still far outnumber them. See Europol: EU Internet Referral Unit, 2023 EU Internet Referral Unit Transparency Report, Publications Office of the European Union, Luxembourg, 2025, p. 10.
.

With the exception of those made by Europol, the use of referrals is based on national law. The legislative proposal put forward by the Commission originally sought to regulate the use of referrals alongside removal orders, prompting questions of accountability for takedowns based on referrals and calls for clear rules distinguishing when to use each tool [66]
 See also FRA, Proposal for a regulation on preventing the dissemination of terrorist content online and its fundamental rights implications – Opinion of the European Union Agency for Fundamental Rights, Publications Office of the European Union, Luxembourg, 2019, pp. 35–37.
. In the adopted text, only removal orders were maintained in the regulation, while referrals remained unregulated.

Findings show that the interplay between the use of removal orders and referrals may have an impact on freedom of expression and information (Article 11 of the Charter) and freedom to conduct a business (Article 16 of the Charter). Furthermore, it relates to the broader transparency issues surrounding the application of the regulation.

There is considerable diversity among Member States when it comes to the use of removal orders and referrals. Most continue to rely primarily on referrals and use removal orders only in particular circumstances, typically in cases of urgency (e.g. content posing an imminent threat to life or likely to go viral) or where the HSP in question is known not to respond to referrals. Some use exclusively one tool or the other, while still others choose between them depending on the circumstances.

For the Member States where both options are possible, the voluntary cooperation prevails over imposing the law.

Competent authority expert
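
The decision pattern interviewees describe can be summarised in a toy sketch, under the assumption that the reported heuristics (urgency or an unresponsive HSP favouring a removal order, a referral otherwise) capture the practice. The function and its criteria are this example’s own simplification, not any authority’s actual procedure.

```python
# Toy sketch of the decision pattern reported in the interviews; the
# criteria below are assumptions distilled from the findings above, not
# any competent authority's actual criteria or code.
def choose_instrument(imminent_threat_to_life: bool,
                      likely_to_go_viral: bool,
                      hsp_acts_on_referrals: bool) -> str:
    if imminent_threat_to_life or likely_to_go_viral:
        return "removal order"  # urgency justifies the binding one-hour tool
    if not hsp_acts_on_referrals:
        return "removal order"  # voluntary flagging known to be ineffective
    return "referral"           # default: voluntary cooperation prevails
```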

Findings show that the lack of clarity as to which content should be targeted by removal orders and which by referrals is a major concern among experts from civil society / academia and some HSP experts.

One risk lies in the uneven and distorted application of the regulation, with the same content being dealt with differently. If competent authorities in different Member States apply different tools to the same type of content, this increases the likelihood that some terrorist content remains online while some content within the margins of freedom of expression is taken down.

Furthermore, this situation blurs the distinction between the two tools and their purposes: removal orders as enforceable tools through which authorities clearly identify content as illegal and assume responsibility for potential errors, and referrals as a way of flagging content for an HSP to assess against its own terms and conditions and, potentially, remove at its own responsibility. Given that referrals are not formal legal requests, no safeguards accompany their use. While the regulation introduces specific provisions to safeguard the fundamental rights of content providers and HSPs – including the scrutiny of cross-border removal orders, remedies, information obligations towards content providers, transparency reports and the right to have content reinstated – none of these apply to referrals. For example, individuals whose content is removed following a referral must rely on the HSP’s regular complaint avenues (see Section 3.2.3).

A number of experts from civil society / academia, and some experts from competent authorities, express the view that referrals could be regarded as going against the spirit and underlying purpose of the regulation to establish a mandatory, urgent and transparent process for the removal of online terrorist content. Some experts from competent authorities note the absence of a legal basis for issuing referrals under their national law. Others state that terrorist content online, when encountered, should be speedily removed to prevent dissemination, something that only removal orders can ensure.

Ultimately, if you want to comply with the spirit of the TCO regulation and also with your obligation to have terrorist content removed, removal orders are the way to go – especially if the content is very clear.

Civil-society/academia expert

In contrast, the majority of interviewed experts from competent authorities continue to use referrals as the go-to tool. The findings show a common belief among competent authorities that referrals work well and are a more agile tool than removal orders. These experts report that HSPs usually act upon referrals within an acceptable deadline, generally up to 24 hours. Referrals also require less time and effort than removal orders. This relates to the formal requirements for issuing removal orders, for example a statement of reasons explaining why the content is considered to be terrorist content (Article 3(4)(b)). As stated by one competent authority expert, this can be straightforward for clear-cut content, but other content requires thorough analysis. Moreover, in some Member States removal orders require the scrutiny of or consultation with other authorities, or are issued or approved by other bodies, which entails having processes in place and sufficient time. This makes referrals more feasible and attractive for some authorities.

[Removal orders] are a tool we only use from time to time […] We understand it should be like this, given that the referral system generally works really well.

Competent authority expert

Some interviewees from competent authorities also appreciate that referrals can capture a broader range of content. Namely, they can address any type of content incompatible with terms and conditions, leading to a takedown even if the HSP does not assess the content as terrorist content. This again raises questions, however, about the clarity of when referrals should be used and the manner in which they may be interpreted by an HSP.

Interestingly, while some experts from competent authorities and civil society / academia indicate that HSPs prefer the legal clarity provided by removal orders, half of the interviewed experts from HSPs, representing companies of different sizes, express a preference for referrals as a tried-and-tested tool. These experts say that removal orders actually slow down the process because they are more formal than referrals, and generally consider them unnecessary. Other HSP experts consider that removal orders increase the risk of over-removal due to the one-hour deadline to remove content.

Issues associated with the interplay between removal orders and referrals are difficult to address due to the lack of comprehensive information about their use, including the types of terrorist content they target. The transparency obligation of Member States under Article 8 of the regulation covers only basic information about removal orders and no information about referrals. While the transparency reports of some Member States provide robust information going beyond the minimum requirements of the regulation (see textbox ‘Promising practice: Supporting transparency through enhanced reporting’), others have limited information value.

Both HSP and civil-society/academia interviewees argue that without comparable data about how Member States apply the regulation, it is currently difficult to assess its full impact and understand the expectations of the authorities, for instance when it comes to content related to certain topics or events. The current lack of information limits the accountability for the proper application of the regulation and blurs the responsibility for potential adverse impacts on various fundamental rights, these experts say. Data on referrals and more detailed data on removal orders and their reasoning would allow experts to identify similar cases and detect trends, helping to reveal potential tendencies towards over-removal or politicising content removal. It could also help raise awareness and might motivate HSPs and content providers to challenge and appeal removal orders where relevant, some civil-society/academia experts indicate.

Such a breakdown would also offer much-needed clarity about the use of removal orders on the one hand, and referrals on the other, and their respective use for particular types of terrorist content. As highlighted by multiple civil-society/academia experts, this would be particularly important given the persisting disproportion between the use of referrals and removal orders.

A number of interviewees, including experts from competent authorities, consider that HSPs have become more responsive to referrals since the entry into force of the regulation. While other factors may contribute to this, including the regulation’s broader impact on increasing HSPs’ awareness and improving their content moderation work, interviewees highlight that the threat of obligatory removal orders in combination with penalties acts as a strong incentive. In fact, several experts from competent authorities highlight this as a benefit of such a hybrid system, where referrals are favoured but another, more severe, instrument is known to be available. One such expert states that this is how they perceive the aim of the regulation – incentivising HSPs to remove content already upon receiving a referral, so that a removal order would not be necessary.

The aim of the regulation is to get HSPs used to removing content already based on a referral, so that a removal order would not have to be used. Experience from other Member States shows that in 90 % of the cases, content is removed, so the whole process seems to work.

Competent authority expert

This interplay between referrals and removal orders, however, can also impact the scrutiny of content by HSPs and, potentially, increase the risk of removal of legitimate content. In accordance with the relevant guidelines of the Council of Europe, public authorities should avoid any activity that exerts pressure on internet intermediaries through non-legal means [67]
 Council of Europe, Appendix to Recommendation CM/Rec(2018)2 of the Committee of Ministers to Member States on the roles and responsibilities of internet intermediaries, 7 March 2018, paragraph 1.1.1.
. The knowledge that a removal order can follow if content is not taken down on the basis of a referral may diminish the voluntary nature of referrals and de facto restrict HSPs’ freedom to assess content, particularly given the potential consequence of being ordered to implement specific measures after receiving two or more removal orders (see Section 3.2).

Some HSPs may be inclined to trust that the authorities have done their due diligence and to remove content rather than review it, some experts from civil society / academia and competent authorities say. This might particularly be the case for HSPs that lack large moderation teams or subject matter experts on terrorism. It may also pressure HSPs to process referrals more quickly, further reducing the time available for reviewing the content, some civil-society/academia experts warn, pointing out that HSPs typically process all law enforcement requests through expedited channels. Considering the volume of referrals issued by competent authorities compared with that of removal orders, this reliance on referrals, combined with the threat of removal orders, can therefore impact the rights of users and companies.

Now that you have the threat of removal orders in the background […] platforms are a lot more likely to listen to the informal referral first. The threat of formal orders and the threat of sanctions has actually created a lot more ‘voluntariness.’ […] It’s voluntariness in a certain sense because you’ve got the big stick being waved if someone doesn’t voluntarily cooperate.

Civil-society/academia expert

In this context, some experts from civil society / academia comment more broadly on referrals as a tool that transfers responsibility from competent authorities to HSPs, bypassing the accountability of the authorities that initiated the process. These experts express doubts about whether private companies are better positioned than public authorities to assess potential terrorist content, citing several reasons. First, HSP terms and conditions are not subject to the same requirements as national legislation and lack transparency, legal clarity and fundamental rights standards; they also vary considerably among platforms. Second, HSPs do not necessarily have the expertise and training needed to assess the terrorist nature of content and the impact of a takedown on rights. Finally, some civil-society/academia experts emphasise that due to the threats associated with terrorist content, but also due to reputational risks and potential economic consequences, many HSPs tend to review alleged terrorist content less rigorously than other illegal content, erring on the side of over-enforcement rather than risking under-enforcement.

According to the Commission’s implementation report, by the end of 2023 competent authorities had issued removal orders to 13 HSPs [68]
 This includes, in alphabetical order, Archive.org, Catbox, Data Room, FlokiNET, Jumpshare.com, Justpaste.it, Krakenfiles.com, Meta, SoundCloud, Telegram, TikTok, Top4Top.net and X. See European Commission, Report from the Commission to the European Parliament and the Council on the implementation of Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, COM(2024) 64 final of 14 February 2024, p. 5.
. HSP transparency reports indicate that while additional companies have been subject to removal orders throughout 2024, the list remains relatively modest and the majority of orders target a small number of HSPs, focusing on social media platforms [69]
 According to available HSP transparency reports, the highest number of removal orders pursuant to Article 3 of the regulation in 2024 was received by Telegram (requestable and accessible with a Telegram account only), followed by TikTok, Spotify, Facebook, Instagram and X.
.

Findings show that several factors play a role in this regard. Already at the stage of detecting potential terrorist content, authorities tend to focus on those HSPs where they know they are more likely to find terrorist content, particularly social media. Some competent authority experts specifically acknowledge that they prioritise larger social media platforms, which offer a ‘bigger pool of fish’ than smaller platforms, making the best use of their limited resources.

When it comes to deciding between sending a referral and issuing a removal order, referrals continue to be prioritised where authorities and HSPs already have established relationships of effective cooperation in place. In this context, an expert from a competent authority shares an example where an otherwise cooperative HSP expressed concerns over the feasibility of the one-hour rule, which prompted the authority to approach it with referrals first. Another competent authority expert states that referrals also help HSPs to better develop their own content moderation, which is important in view of the vast amount of online content.

However, this approach is not necessarily applied across the board. While authorities may send referrals to small or medium-sized HSPs to avoid overwhelming their moderation teams, they can opt to send removal orders to bigger HSPs that are deemed more likely to have the channels and processes in place to implement them.

As some experts from competent authorities and civil society / academia explain, active engagement by HSPs makes it easier for authorities to issue removal orders. Some platforms, on the other hand, hesitate to cooperate with law enforcement or are difficult to reach. This is the case for many HSPs located outside the EU that have not fulfilled the obligation under Article 17 to designate a legal representative. Such HSPs are less likely to respond to removal orders and might require intensive follow-up work, which reduces the incentive for competent authorities to issue removal orders to them in the first place.

Small [HSPs] have no knowledge at all about the regulation, and it’s difficult to reach them because you have no awareness of their existence at all. […] It’s difficult to make the internet a safer place when you have no clue who the actors are.

Competent authority expert

More work is also needed on mapping the HSP landscape, even within the EU, some experts from competent authorities acknowledge. Findings show that regulatory bodies that are often also responsible for the implementation of the DSA do not necessarily have an overview of companies qualifying as HSPs under the regulation, which de facto excludes some HSPs from being targeted by removal orders or referrals.

When you look at the list of platforms that have received removal orders, […] they were mostly big platforms or what I would call the ‘usual suspects.’ […] What I would see as quite a glaring gap in the enforcement is smaller platforms that are less well known but actually are hosting vast amounts of content.

Civil-society/academia expert

While these approaches might reflect the working methods of competent authorities and allow them to leverage their resources, they also leave room for arbitrariness and raise questions of proportionality and transparency. The freedom to conduct a business of certain HSPs may be affected more than that of others, and the legal provisions of the regulation, including its safeguards, may be applied differently. Furthermore, the focus on certain HSPs risks diverting attention away from other, less obvious yet relevant platforms, potentially undermining the effectiveness of efforts to address the dissemination of terrorist content online.