17 November 2025

Regulating online terrorist content – Balancing public safety and fundamental rights

Online terrorist content is a threat to fundamental rights, the rule of law and democracy. EU measures to tackle such content aim to prevent terrorism while upholding these values. FRA’s report looks at how online terrorist content is detected and removed under EU legislation. It highlights challenges in interpreting the rules, risks of over-removal and potential impacts on freedom of expression. It finds that moderation practices by authorities and platforms can disproportionately affect certain groups, such as Muslims and Arabic speakers, while far-right content often receives less scrutiny. The findings, based on research and expert interviews with those addressing online terrorist content, offer ways to improve transparency in content moderation and to better balance public security and fundamental rights, contributing to wider debates on regulating online content responsibly.

To define online terrorist content, the regulation relies on existing definitions of ‘terrorist offences’ and ‘terrorist group’ in Directive (EU) 2017/541 on combating terrorism (see textbox ‘Definitions of terrorist offences and a terrorist group in EU law’). Article 2(7) of the regulation encompasses content that incites or solicits the commission of one of the terrorist offences established by Directive (EU) 2017/541 or constitutes a threat to commit one of the offences. Material that solicits participation in the activities of a terrorist group also falls under this definition. Finally, material that provides instruction on the making or use of explosives, firearms or other weapons or noxious or hazardous substances, or on other specific methods or techniques for the purpose of committing or contributing to the commission of one of the offences, is likewise considered terrorist content.

Concerning incitement to commit terrorist offences, the regulation incorporates the definition of public provocation to commit a terrorist offence in Article 5 of Directive (EU) 2017/541, including content that advocates the commission of terrorist offences indirectly, for example by glorifying terrorist acts.

The regulation only applies to content that is ‘disseminated to the public’, i.e. made available to a potentially unlimited number of persons (Article 2(3) of the regulation). As a result, content shared privately, for example via messaging applications, falls outside its scope.

Furthermore, Article 1(3) – similarly to Recital 40 of Directive (EU) 2017/541 – specifies that material shared with the public for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering terrorism, including material expressing polemic or controversial opinions within public debate, will not be classified as terrorist content, and that an assessment will determine whether material is disseminated for these purposes.

Findings related to the definition of terrorist content online have implications for the application of the regulation as a whole, including its key provisions on issuing removal orders, scrutiny of cross-border removal orders, the application of specific measures and penalties, and access to remedies, as examined in Chapters 2, 3 and 4. Besides the general principles of legal clarity and foreseeability, these issues impact, in particular, freedom of expression and information (Article 11 of the Charter). In addition, they can have an impact on a variety of other rights, including but not limited to freedom of thought, conscience and religion (Article 10 of the Charter), freedom of assembly and of association (Article 12 of the Charter), freedom of the arts and sciences (Article 13 of the Charter), freedom to conduct a business (Article 16 of the Charter), non-discrimination (Article 21 of the Charter) and the right to an effective remedy (Article 47 of the Charter).

This chapter covers respondents’ experiences and views as regards the impact on fundamental rights stemming from the definition of terrorist content in the regulation and its application by competent authorities in practice. It focuses first on the suitability of the definition – its legal clarity and foreseeability. Then it discusses how the diversity in competent authorities applying the definition affects these challenges. The chapter subsequently zooms in on the interplay between the definition and the risk of over-removal when it comes to educational, journalistic, artistic and research-related content.

The findings, based on the views of interviewed experts across professional groups, show that the definition of online terrorist content presents one of the main challenges when it comes to the fundamental rights impact of the regulation. The clarity of the definition and foreseeability of its use are essential both to ensure uniform application of the regulation across the EU and to avoid unintended risks for a variety of rights.

During the negotiations on the proposed regulation, the definition was subject to considerable discussion. In a joint communication, three UN special rapporteurs expressed concerns that the regulation would go beyond content that is criminal in nature and called for the definition of terrorist content to be narrowly construed, to guarantee that measures taken pursuant to it do not unduly interfere with human rights [31]. Some of these concerns were reiterated during the parliamentary and expert discussions on national legislation implementing the regulation in a number of Member States [32]. In the adopted text of the regulation, the definition was more closely linked to the criminal law definitions in Directive (EU) 2017/541.

[31] See UN, Mandates of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression; the Special Rapporteur on the right to privacy and the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, OL OTH 71/2018, 2018, pp. 2–5. See also Meijers Committee, ‘CM1904 Comments on the proposal for a regulation on preventing the dissemination of terrorist content online (COM(2018) 640 final)’, 2019, pp. 1–2; European Digital Rights, ‘Letter to Member States calls for safeguards in terrorist content regulation’, European Digital Rights website, 16 December 2019.

[32] See, for example, In-Cyprus, ‘New bill aims to crack down on online “terrorist content”’, In-Cyprus website, 17 May 2024; the lower house of the German parliament (Deutscher Bundestag), Stenografischer Bericht 44. Sitzung, Plenarprotokoll 20/44, 23 June 2022, pp. 4598–4602 (Annex 9); Estonian Parliament (Riigikogu), Protocol of the session on 13 December 2023; Hellenic Parliament (Βουλή των Ελλήνων), Minutes of K’ Period, Plenary Session NB’ (Πρακτικά Ολομέλειας, Κ’ Περίοδος, Σύνοδος Α’, Συνεδρίαση ΝΒ’), 16 November 2023, p. 5605 et seq.; Arangüena Fanego, C., ‘New steps against terrorism in the EU: Regulation (EU) 2021/784 and orders for the removal of terrorist content online’, Revista de Estudios Europeos, 2023; French National Assembly (Assemblée nationale), ‘Referral to the Constitutional Council by 60 deputies, No 2022-841 DC received by the court registry of the Constitutional Council’ (‘Saisine du Conseil constitutionnel par 60 députés, No2022-841 DC reçue par le greffe du Conseil constitutionnel’), 29 July 2022; the Parliament of Luxembourg (Chambre des Députés), ‘Verbatim – Debate on file 8325 – Session of 02.07.2024’ (‘Verbatim – Débat sur le dossier 8325 – Séance du 02.07.2024’), 2 July 2024, p. 3; National Council of the Judiciary of Poland, ‘Opinion on the draft law amending the Anti-Terrorism Act and the Act on the Internal Security Agency and the Intelligence Agency’, 28 November 2024.

Several interviewees, including experts from competent authorities and civil society / academia, question the overall suitability of the definition. The definitions of terrorist offences in Directive (EU) 2017/541, which the regulation relies on, have been developed for the purpose of criminal proceedings where they are applied by a court based on the combination of objective elements of the crime in question and the perpetrator’s intent. According to these experts, this makes the definitions inherently ill-suited for speedy decision-making in administrative proceedings, which are not accompanied by the same requirements and level of procedural safeguards. From this perspective, a clearer, unambiguous distinction between legal and illegal content would appear more appropriate.

In general, most interviewed experts consider that the definition of online terrorist content in the regulation would benefit from additional clarity and foreseeability. This includes the majority of experts from competent authorities. Some of them highlight that a joint understanding of what types of content should fall under the scope of the regulation and a common baseline for assessing content are still missing.

[The regulation] is a legal text that cannot cover all aspects of what one sees in disseminated content, so you always put your judgement as an expert into it. And […] maybe the same content is viewed a bit differently by different experts from different countries. And then, if it is country A [issuing a removal order] and country B scrutinising it, if this joint understanding, this baseline is not there, then maybe there is an issue.

Competent authority expert

Despite harmonisation through Directive (EU) 2017/541, differences across Member States in the definitions of terrorist offences, such as incitement (public provocation) to terrorism, and in operational realities give rise to different interpretations of what constitutes terrorist content online. Some experts from this professional group say that clarifying what content falls under the definition requires internal consultations with other institutions or legal experts. Only a minority of experts from competent authorities consider the regulation sufficiently clear, with some adding that they in fact appreciate a broader definition that leaves room for a more flexible application of the regulation within the national context.

Civil-society/academia experts generally perceive the wording of the individual provisions of Article 2(7) as too vague and open to interpretation. Some warn that this is likely to lead to diverging interpretations among Member States, and possibly also among practitioners within the same Member State. Besides reducing the foreseeability of how the regulation is applied, such divergence also underlines the need to apply the mechanism for scrutiny of cross-border removal orders systematically and rigorously (see Chapter 4).

Concerns over the clarity of the definition are also raised by experts with direct content moderation perspectives. Some HSP experts and civil-society/academia experts with such experience note that the definition in the regulation is not well-suited to practical use and is difficult to apply in the context of online content. In their view, technical experts dealing with content moderation could have been involved when drafting the regulation to find a definition better suited to the specificities of online content moderation.

Interviewees across professional groups consider that the main challenge is associated with interpreting the concept of incitement to terrorism (Article 2(7)(a)), in particular the inclusion of ‘glorification’ of terrorist acts.

Human rights experts have long been critical of including vague concepts such as ‘glorifying’ or ‘promoting’ terrorism in the definition of incitement (public provocation), arguing that these not only lack the necessary precision but also make it difficult to establish the risk that an actual terrorist offence might be committed as a result (an element also required by Article 2(7)(a) of the regulation) [33].

[33] See Scheinin, M., UN Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, ‘Ten areas of best practices in countering terrorism’, A/HRC/16/51, 22 December 2010, pp. 15–16.

Civil-society and academic experts, in particular, highlight the distinction between glorification and direct incitement as a source of concern. Since Recital 11 also brings the dissemination of ‘material depicting a terrorist attack’ within the regulation’s scope, any content that comments on an act of terrorism could be subject to removal on this basis. Others recall that the entire concept of glorification is based on the subjective intention of the author to encourage terrorist activity, which is difficult to assess in the case of online content. As also shown in FRA’s earlier research on terrorism, the link between glorification and intent is so intrinsic that the boundaries between polemic or radical, yet permissible, expression and glorification are often difficult to draw even during criminal proceedings [34]. For this reason, some civil-society/academia experts suggest opting for much narrower definitions capturing content which should objectively be subject to removal regardless of its context and intent, for example – as suggested during the fieldwork – decapitation imagery.

[34] FRA, Directive (EU) 2017/541 on combating terrorism – Impact on fundamental rights and freedoms, Publications Office of the European Union, Luxembourg, 2021, pp. 57–66.

The fact that glorification made it to the final text, and that any type of expression that simply comments on a committed terrorist attack or inappropriately criticises public officials or the government, could fall into the scope of glorification […] is an enormous issue.

Civil-society/academia expert

Experts from competent authorities likewise point out issues with the concept of glorification. Pointing to the differences in Member States’ legislation, some state that glorification as referred to in the regulation does not feature in their national law and that they would not be sure how to apply it if they encountered such content. Others say that the application of this definition is straightforward only if the content relates to organisations clearly recognised as terrorist groups or if it glorifies specific past terrorist attacks or their perpetrators.

These questions surrounding the legal clarity and overall suitability of the definition can have a significant impact on freedom of expression and information, freedom of thought, conscience and religion, and freedom of assembly and association.

The majority of interviewed experts from civil society / academia warn that the definition leaves too much room for subjective assessment and removal of content that is not of a terrorist nature. In this context, some recall that Directive (EU) 2017/541 – despite its Recital 40 aimed at safeguarding the expression of radical, polemic or controversial views – has been subject to critique for being overly broad and over-inclusive, potentially leaving room for application influenced by political considerations and for covering activities that are not of a terrorist nature, such as solidarity movements or environmental activism [35]. This conflation of legitimate protest and terrorism, in turn, gives rise to a risk of over-enforcement of the regulation and could result in targeting views critical of the government, experts from civil society / academia warn.

[35] FRA, Directive (EU) 2017/541 on combating terrorism – Impact on fundamental rights and freedoms, Publications Office of the European Union, Luxembourg, 2021, p. 10.

Experts from competent authorities acknowledge the potential impact that the regulation can have, in this respect, on fundamental rights. Some of them expressly state that, when applying the regulation, they are aware of the need to pay close attention to the risk of affecting freedom of expression in particular and of capturing political opinions. According to these experts, the clarity issues associated with the regulation’s definition of terrorist content make them even more cautious and selective when applying it. This leads them to issue removal orders only in clear-cut cases of terrorist content which can withstand judicial scrutiny.

We act against clear terrorist content. Nobody will be affected in their freedom of expression […]. We do not act against political opinions. […] So, we are only on safe territory.

Competent authority expert

According to Article 12(1), Member States have to designate authorities competent to issue removal orders, scrutinise cross-border removal orders, oversee the implementation of specific measures and impose penalties. Recital 35 requires that competent authorities fulfil their tasks in an objective and non-discriminatory manner. In all other aspects, the regulation leaves the choice of competent authorities to the discretion of Member States.

In practice, this results in a diverse landscape across the EU when it comes to which authorities are responsible for interpreting and applying the definition of what constitutes terrorist content (i.e. issuing removal orders and scrutinising cross-border removal orders issued by other Member States) [36]. Depending on the jurisdiction, this may include law enforcement agencies, intelligence services, public prosecutors, media regulators or other administrative bodies. Only in a small number of Member States [37] are courts involved, to varying degrees, in the assessment of the legality of content and the issuing of removal orders.

[36] The updated list of national competent authorities and contact points is available on the Commission website.

[37] This is notably the case in Cyprus (The prevention of the dissemination of terrorist content on the internet Act of 2024 (O περί της Πρόληψης της Διάδοσης Τρομοκρατικού Περιεχομένου στο Διαδίκτυο Νόμος του 2024), Article 3(1)); in Denmark (Act on supplementary provisions to the regulation on handling the dissemination of terrorist content online (Lov om supplerende bestemmelser til forordning om håndtering af udbredelsen af terrorrelateret indhold online), Article 4(1)); in Malta (Addressing the dissemination of terrorist content online regulations, Legal Notice 7 of 2023, Article 4(1)); and in Slovenia (Act implementing the Regulation (EU) on addressing the dissemination of terrorist content online (Zakon o izvajanju Uredbe (EU) o obravnavanju razširjanja terorističnih spletnih vsebin (ZIUORTSV)), 23 September 2024, Article 3(1)). In Italy, the preliminary investigations judge is competent for scrutinising cross-border removal orders (see Legislative Decree No 107 of 24 July 2023 on adaptation of national legislation to the provisions of Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on combating the dissemination of terrorist content online (Decreto Legislativo 24 luglio 2023, n. 107, Adeguamento della normativa nazionale alle disposizioni del regolamento (UE) 2021/784 del Parlamento europeo e del Consiglio, del 29 aprile 2021, relativo al contrasto della diffusione di contenuti terroristici online), Article 4).

This necessarily leads to the involvement of authorities with different levels of expertise in the field of counterterrorism, on the one hand, and fundamental rights, on the other, depending on the particular set-up in each Member State. This, in turn, exacerbates the challenges related to a uniform application of the definitions and the assessment of what terrorist content is, and may further impact the clarity and foreseeability of how the regulation is applied and how fundamental rights are safeguarded in the process.

From the fundamental rights point of view, when assessing the nature of online content and the need for its removal, an independent judicial authority would be best placed to make an impartial decision that meets public security needs without violating fundamental rights, an argument voiced during the discussions on the draft regulation [38]. This was also emphasised by some experts from civil society / academia interviewed for this research, some of whom stated that even if the body making the assessment is not a court, it needs to be independent of the interests involved, both those of law enforcement and those of HSPs. This is of particular importance given that removal orders have an immediate effect on fundamental rights while a subsequent remedy may have only a limited restorative effect (see Chapter 4). Concerns over which entity would be designated as the national competent authority for issuing removal orders featured prominently in parliamentary and public debates surrounding the implementation of the regulation across Member States. Some of the concerns raised in these discussions related to an alleged risk of online monitoring and abuse by law enforcement or intelligence services [39], questions of independence of the designated competent authorities [40] or calls to involve judicial authorities [41]. In some Member States, these concerns led to amendments integrating additional safeguards [42].

[38] See, for example, Kuczerawy, A., The proposed regulation on preventing the dissemination of terrorist content online: Safeguards and risks for freedom of expression, Center for Democracy and Technology, 5 December 2018, p. 10; FRA, Proposal for a regulation on preventing the dissemination of terrorist content online and its fundamental rights implications – Opinion of the European Union Agency for Fundamental Rights, Publications Office of the European Union, Luxembourg, 2019, pp. 23–26.

[39] See, for example, Nomoplatform, ‘Εναρμόνιση πρόληψης διάδοσης τρομοκρατίας στο διαδίκτυο και διαχείριση περιουσίας ανίκανων προσώπων’ (‘Harmonising the prevention of the dissemination of online terrorism and the management of assets of incapacitated persons’), Nomoplatform website, 9 September 2024; National Council of the Judiciary of Poland, ‘Opinion on the draft law amending the Anti-Terrorism Act and the Act on the Internal Security Agency and the Intelligence Agency’, 28 November 2024.

[40] See, for example, the Danish Parliament (Folketinget), Parliamentary debate (L 166 Forslag til lov om supplerende bestemmelser til forordning om håndtering af udbredelsen af terrorrelateret indhold online), pp. 27–31; the Minister of Justice and Security of the Netherlands (Minister van Justitie en Veiligheid), ‘Explanatory Memorandum. Regulation on preventing the dissemination of terrorist content online implementation Act’ (‘Memorie van Toelichting. Uitvoeringswet verordening terroristische online-inhoud’), 2022, pp. 15–16.

[41] See, for example, Arangüena Fanego, C., ‘New steps against terrorism in the EU: Regulation (EU) 2021/784 and orders for the removal of terrorist content online’, Revista de Estudios Europeos, 2023; the Assembly of the Republic of Portugal (Assembleia da República), Draft Law 44/XVI. Proposed amendments to Article 2 (Proposta de Lei 44/XVI. Propostas de alteração ao artigo 2.º), 5 February 2025.

[42] In Slovenia, for example, only one court (District Court Nova Gorica) can issue removal orders. This model was selected to ensure proportionality of measures and legal certainty, given that removal orders are considered to interfere with constitutional rights, and also to ensure specialisation and a uniform decision-making practice. See the National Assembly of Slovenia (Državni zbor Republike Slovenije), ‘23rd Regular meeting of the Committee on Home Affairs, Public Administration and Local Self-Government’ (‘Odbor za notranje zadeve, javno upravo in lokalno samoupravo’), selected transcript of the meeting (Izbrani zapis seje), 9 October 2024.

When it comes to tasking law enforcement and intelligence authorities with assessing the nature of online content and the need for its removal, some civil-society/academia and HSP experts raise concerns in relation to the rule of law, institutional priorities and the absence of checks and balances. In the case of media regulators, interviewees’ concerns focus on expertise in dealing with terrorist content and a potential conflict with their mandate to regulate internet platforms.

On a positive note, interviews with experts from competent authorities and civil society / academia show that in several Member States, additional safeguards are in place in the form of internal review mechanisms introducing layers of oversight. In some cases, draft removal orders are reviewed by superiors before they can be issued. In others, multiple staff with different expertise are involved in assessing content, either in a formalised manner or by means of ad hoc consultations. In some competent authorities, internal boards review selected cases, such as those which are not fully clear or are considered to be of a precedential nature.

I was not sure [our authority] would be assigned this role, we thought it would be a judge. But finally, it was decided that we do it […] and it does work.

Competent authority expert

Some experts from authorities tasked with issuing removal orders or scrutinising cross-border removal orders acknowledge that the regulation requires them to carry out tasks that go beyond their existing expertise. In particular, as regards assessing the impact of content removal on fundamental rights, some of these experts consider courts better equipped for the task. Most, nevertheless, indicate that the transition has been successful.

Interviewees across professional groups also underline the importance of sufficient resources. Experts from civil society / academia and HSPs note that capacity across Member States in terms of staffing, training to assess content and language expertise differs, which may be one of the reasons for the uneven use of the regulation across the EU.

In terms of capacity, it’s not always easy to work on daily basis. There are tasks for [the] TCO [regulation], but there are also other things to do.

Competent authority expert

This is confirmed by experts from competent authorities. While some of them consider the resources available to them sufficiently robust, a number of experts note that the application of the regulation resulted in an increase in their workload, which has not necessarily been accompanied by a commensurate increase in resources. Recalling the vast amounts of online material, some of these experts note that resources limit their ability to focus on online terrorist content on a daily basis or to specialise in detecting and assessing particular types of content.

Besides the need to safeguard freedom of expression and information more broadly, Article 1(3) of the regulation recognises that material disseminated for certain legitimate purposes warrants special protection – a provision that a number of interviewees consider to be an important safeguard.

In general, interviewees across professional groups acknowledge the potential impact of the regulation on protected forms of speech and the importance of carefully assessing whether particular content may fall into this category, which may be difficult given the definitional issues surrounding terrorist content and, in particular, glorification. Some interviewees recall past cases, unrelated to the regulation, where content produced by researchers working on terrorism was removed. Such a risk may arise, for example, in relation to events that involve terrorist organisations and are heavily covered by media reporting and academic research, such as the situation in Israel and Gaza following the attacks of 7 October 2023. Some interviewees point to a possible interplay with discrimination if some languages or backgrounds are associated with terrorism more than others. As an example of such concerns, a civil-society/academia expert notes that some academics working on terrorism-related topics might be at a higher risk of having their content removed than others, due to factors such as their name or the language in which they write.

In terms of artistic expression, multiple interviewees draw parallels between terrorist content and violent content more generally and point to a case dealt with by the Oversight Board in 2022 concerning removals of drill rap music content [43] based on referrals by the UK Metropolitan Police due to alleged threats of violence. The Oversight Board overturned these decisions, citing the absence of sufficient evidence that the content contained a credible threat and the need to give more weight to its artistic nature [44]. Interviewees, including some experts from competent authorities, note that this case illustrates possible challenges when it comes to terrorist content and artistic expression.

[43] A subgenre of rap, drill features confrontational lyrics and deals with themes such as gang rivalries and violence.

[44] Oversight Board, ‘UK drill music’, Oversight Board website, 22 November 2022.

To be fair, I believe competent authorities already have enough work to do and if they see that something is from a trusted source, like a think tank or a researcher, this would not be the content they would focus on, for the most part.

Civil-society/academia expert

Respondents across professional groups generally state that concrete experience with having to assess whether potential terrorist content falls under the protective provision of Article 1(3) is limited so far. Explaining this, one civil-society/academia expert refers to the limited capacity of competent authorities, which forces them to focus only on clearly terrorist content, such as propaganda disseminated directly by terrorist organisations, thereby reducing the risk of overreach into such protected forms of content. An expert from a competent authority shares a practical example of a video that contained what would otherwise clearly be considered terrorist content, but that was shared by a university for educational purposes rather than to promote terrorism. At the same time, some experts from competent authorities point to the existence of outlets belonging to actual terrorist organisations and intended to disseminate their propaganda (which would obviously not benefit from the protection envisaged in the regulation). Other interviewees point to the tactic of some malicious actors of disseminating terrorist content under the guise of journalistic or educational purposes in order to avoid enforcement, choosing platforms that do not apply stringent scrutiny to such content.

For media […], it’s very difficult to assess, and I don’t think we are going to remove it that easily. Educational platforms, we don’t touch them, if we know them.

Competent authority expert

Interviewees from competent authorities also outline some approaches that, in their view, mitigate the risk of targeting content disseminated for such legitimate purposes. For example, one expert explains that they avoid targeting educational platforms, especially if they are familiar with them or if their review confirms the platform’s educational nature. Others recall that the general principle of assessing content as well as its context (e.g. not just the actual footage but also the circumstances of its dissemination) is of particular importance when it comes to correctly recognising educational, journalistic or similar content.