Terrorist content online presents a key threat to fundamental rights, democracy and the rule of law. The EU’s legislative and other efforts to address such content can serve to underpin fundamental rights and the prevention of terrorism.
This report presents the findings of the European Union Agency for Fundamental Rights (FRA) on the impact on fundamental rights of the application of Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online, as the key EU instrument in this field.
Terrorist content proliferates in the online environment alongside hybrid threats, disinformation and other security threats. It can manifest itself in a variety of ways, ranging, for example, from propaganda materials published directly by terrorist organisations on their websites, to footage of terrorist attacks disseminated on online platforms, to social media posts inciting the commission of terrorist offences. The dissemination of such content poses significant security risks and can impact fundamental rights, including the right to life and human dignity – to name just two.
In an effort to support the fight against terrorism, the findings in this report – based on the experiences of practitioners and other experts working with the regulation – can be used to enhance the regulation’s application, where needed.
While the analysis contained in the report focuses specifically on the application of the regulation in practice, its findings also contribute to discussions on the broader challenges of regulating freedom of expression online beyond the context of terrorist content.
The regulation provides national competent authorities, such as law enforcement agencies and media regulatory bodies, with the possibility to order social media platforms and other hosting service providers (HSPs) anywhere in the EU – including HSPs located outside the EU but providing services within its territory – to promptly remove content disseminated to the public on their platforms that the authorities consider to amount to terrorist content. Furthermore, HSPs considered to be exposed to terrorist content can be ordered to adopt specific measures to counter this exposure. The regulation thereby complements existing voluntary cooperation between authorities and HSPs, such as the EU Internet Forum and the Code of Conduct on Countering Illegal Hate Speech Online, with a set of enforceable tools. The EU Platform on Illicit Content Online (PERCI), established by the European Union Agency for Law Enforcement Cooperation (Europol), supports EU Member States in applying the regulation.
The regulation recognises the need to respect fundamental rights when implementing its provisions. Specifically, Article 23 requires the European Commission to assess the regulation’s impact on fundamental rights. The Commission requested that FRA carry out research in connection with this assessment.
This report presents the main findings from FRA’s research, which can serve to support the evaluation of the regulation, and offers FRA’s own independent assessment and conclusions. It provides insights into the experiences of practitioners and other experts with in-depth knowledge of this field regarding the practical application of the regulation’s provisions at the national level. These empirical findings confirm that while the regulation serves an important and legitimate goal, it also affects a wide range of fundamental rights and freedoms safeguarded by the Charter of Fundamental Rights of the European Union (the Charter) and international human rights instruments. In this respect, the report speaks to some of the concerns expressed by national and EU stakeholders, international human rights bodies, civil-society organisations, professional associations, academics and HSPs when the draft regulation was proposed in 2018.
The Charter rights that the regulation most directly affects include freedom of expression and information (Article 11), the right to respect for private and family life (Article 7), protection of personal data (Article 8), freedom of thought, conscience and religion (Article 10), freedom of assembly and of association (Article 12), freedom of the arts and sciences (Article 13), freedom to conduct a business (Article 16), non-discrimination, including on grounds of ethnic origin, religion or belief (Article 21), and the right to an effective remedy and to a fair trial (Article 47).
This set of findings draws on interviews with 62 experts: practitioners from selected competent authorities involved in applying the regulation, ranging from law enforcement and counterterrorism agencies to regulatory bodies, as well as experts from HSPs affected by the regulation and civil-society and academic experts focusing on the topic. Limited desk research in 27 Member States supported the fieldwork by collecting basic information about the legal and institutional framework underpinning the application of the regulation at the national level.
Combating terrorist content while respecting fundamental rights is a complex and challenging task. Findings from FRA’s research show that competent authorities applying the regulation are generally aware of the potential fundamental rights impact of their work and undertake efforts to target only content that is clearly terrorist in nature.
Still, a number of challenges emerge as regards the impact that applying the regulation has on fundamental rights, as the findings show. The report brings the findings to the attention of the EU institutions and Member States and can help them assess the need for further steps to ensure that the application of the regulation complies fully with fundamental rights.
As the fieldwork interviews covered authorities in a limited number of Member States, alongside selected HSPs and other experts, the findings do not claim to be representative of the situation in the EU or for HSPs as a whole. In addition, with the regulation applicable since July 2022, the degree of its use varies across the EU, which limits practical experience with some of its key elements. Nevertheless, the results provide a valuable insight into how experts who apply the regulation in their work, or are directly affected by its application, experience its impact on fundamental rights.
Article 2(7) of the regulation defines terrorist content by means of its relationship with one of the terrorist offences established under Directive (EU) 2017/541 on combating terrorism, such as content soliciting, inciting or threatening to commit one of these offences. Article 1(4) states that the regulation shall apply without prejudice to freedom of expression and information, including freedom and pluralism of the media, under Article 11 of the Charter. In addition, Article 1(3) in conjunction with Recital 12 stipulates that content disseminated for educational, journalistic, artistic, research or counterterrorism purposes, and material expressing polemic or controversial views in the public debate, will not be considered terrorist content.
Interviewees express concern that the definition of terrorist content provided by the regulation, which determines the scope of application of the instrument as a whole, does not offer sufficient clarity as to what content can be considered terrorist content and thus liable to removal. This can hamper the uniform application of the regulation across the EU and in relation to different types of terrorist content (see FRA opinion 8), reduce foreseeability and create risks for freedom of expression and information and a variety of other rights, as content that is not terrorist in nature could be removed. This is further compounded by the diversity of authorities across the EU that use the definition when applying the different provisions of the regulation. While these authorities play a key role in ensuring a fundamental-rights-compliant application of the regulation, they are seldom judicial bodies and have varying degrees of fundamental rights expertise and resources.
Findings show that the inclusion of ‘glorification’ in the definition of material inciting a terrorist offence, one of the types of terrorist content, makes it particularly challenging for competent authorities to draw a clear line between terrorist content and permissible forms of expression, which may in some cases include radical or controversial views. Although competent authorities focus their efforts on capturing content that is clearly terrorist in nature, encompassing the concept of glorification – rather than focusing on direct incitement to commit a terrorist offence – increases the risk of overreach into legitimate content, including political opinions.
There is still limited experience with cases of educational, journalistic, artistic or research-related content, and specific safeguards have been implemented in some Member States to avoid such content being subject to removal. However, interviewees express concerns that reporting or research on particular conflicts or events that involve terrorist groups or trigger polarised public debate might be impacted.
The Commission should consider reviewing the definition of terrorist content, such as the reference to ‘glorification’, and providing further clarity. This would facilitate its use by competent authorities, strengthen the foreseeability of the application of the regulation for HSPs and users of their services and help provide for a comparable level of fundamental rights safeguards across the EU.
The Commission and Europol should, within their respective mandates, continue to facilitate discussions, promote exchange of experience and provide technical support to Member State authorities to support uniform application of the regulation, including a common, fundamental-rights-compliant interpretation of the definition of terrorist content.
Member States should ensure that competent authorities are well equipped to interpret the definition of terrorist content when applying the different provisions of the regulation in a manner that safeguards fundamental rights, and that they have sufficient resources for their tasks, including by making available guidance and training where needed.
Removal of online content affects freedom of expression and information (Article 11 of the Charter) and can affect a broader range of other fundamental rights, including freedom of thought, conscience and religion (Article 10 of the Charter), freedom of assembly and of association (Article 12 of the Charter), freedom of the arts and sciences (Article 13 of the Charter) and non-discrimination (Article 21 of the Charter).
According to Article 3 of the regulation, competent authorities can issue removal orders which require HSPs to remove, or disable access to, terrorist content in all Member States within one hour. The regulation does not envisage HSPs reviewing, upon receiving a removal order, whether the content indeed constitutes terrorist content, nor does it grant them a possibility to contest the removal order prior to its execution. Together with the risk of penalties for non-compliance, this may incentivise HSPs to take down content even if they consider the competent authority’s assessment to be erroneous. FRA’s findings show that smaller HSPs in particular, or those without specific fundamental rights expertise, are likely to defer fully to the expertise of the authorities. While interviewees across all professional groups acknowledge the need for speedy takedowns of content posing a particular risk, some question whether the same urgency applies to all types of content falling within the scope of the regulation, and what impact this may have on the rights of HSPs and of users whose content may be subject to removal (content providers).
In addition, many competent authorities continue to prefer the use of referrals. The regulation (Recital 40) does not preclude Member States from using referrals as a tool for voluntary cooperation, by which national law enforcement or counterterrorism authorities (and Europol) inform HSPs about potential terrorist content detected on their platforms, allowing HSPs to review it against their terms and conditions. Findings show that competent authorities often consider referrals more practical and agile than removal orders. However, interviewees express concerns over the interplay between the two tools, stating that the regulation neither sufficiently clarifies their relationship nor differentiates when removal orders and referrals, respectively, should be used. Notably, interviewees observe that the prospect of receiving a removal order might incentivise some HSPs to remove content based on a referral without a meaningful assessment. At the same time, referrals are not accompanied by the safeguards the regulation envisages for removal orders, and their use may raise questions about the accountability of public authorities and private actors for removing content.
Under Article 5, competent authorities can designate HSPs that have received two removal orders within the last 12 months as ‘exposed to terrorist content’ and oblige them to implement ‘specific measures’ (such as additional technical means to identify and remove content) to counter such exposure. When putting in place such specific measures, Article 5(3) in conjunction with Recital 23 requires HSPs to ensure that users’ fundamental rights, in particular freedom of expression and information, respect for private life, protection of personal data and the right to non-discrimination, are preserved.
Yet, FRA’s findings show that the prospect of being designated as ‘exposed to terrorist content’ can increase the risk that HSPs over-moderate legitimate content to pre-empt such designation. Furthermore, once ordered to implement specific measures, the requirement to combat the presence of terrorist content on their platforms more effectively may motivate HSPs to employ further restrictive policies and intrusive tools, potentially resulting in the over-blocking of legitimate content and the general monitoring of content on their platforms. At the same time, while competent authorities are obliged to review the application of specific measures by HSPs to ensure that they comply with all the requirements under the regulation, including that of Article 5(3), FRA’s findings show that competent authorities would need more guidance on how to assess the fundamental rights impact of such measures.
The Commission could consider adjusting the mechanism for using and processing removal orders envisaged in the regulation. Namely, where a duly justified concern exists that the content does not meet the definition of terrorist content, the regulation should provide HSPs with an effective possibility to challenge the removal order before removing the content.
Furthermore, the Commission should clarify which situations and types of content justify the use of, respectively, removal orders and referrals, to increase legal clarity and foreseeability and give full effect to the safeguards envisaged by the regulation.
The Commission should issue guidance to HSPs and Member States on how to implement specific measures in a manner that respects fundamental rights, including that such measures do not result in general monitoring of online content. To this end, the Commission could consider providing a more concrete list of available specific measures in the regulation. Member States should ensure that competent authorities have effective systems in place to monitor specific measures implemented by HSPs, and that such monitoring pays due attention to the impact of the measures on fundamental rights, such as freedom of expression and non-discrimination. Appropriate guidance and training should be made available to relevant competent authorities, ensuring that they are equipped with sufficient expertise and knowledge when it comes to assessing the impact of specific measures on fundamental rights.
In line with Article 21 of the Charter, Recital 10 prohibits any discrimination when applying the regulation. This prohibition applies both to authorities issuing removal orders and to HSPs when they apply specific measures pursuant to Article 5. More broadly, the Digital Services Act (DSA) contains a general requirement for HSPs to act with due regard to the fundamental rights of the recipients of their services (Article 14 DSA) and an obligation for those HSPs that are very large online platforms or very large online search engines to assess the risk of discrimination, including when using algorithmic systems (Article 34 DSA).
Findings show that detection by competent authorities and content moderation by HSPs focus predominantly on jihadist content, which is considered a key security threat in the EU. At the same time, content related to particular topics, such as sensitive current political issues, or in particular languages is challenging to assess correctly. Together with the clarity issues associated with the definition of terrorist content, these factors increase the risk that removal of legitimate content disproportionately affects content providers based on their ethnic origin, language, religion or belief, or political opinion, amounting to discrimination. According to the research, Muslims and Arabic speakers are at heightened risk.
HSP content moderation, driven partly by regulatory pressures, including the regulation and other EU and national laws, relies increasingly on automated tools despite persisting concerns over their reliability. FRA’s findings show that this is not necessarily compensated for by sufficiently robust human oversight, as human review of content flagged by automated tools can be limited due to factors such as time constraints, language expertise and inadequate working conditions of content moderators. This can impact a wide range of rights of online users, including the right to non-discrimination.
Furthermore, interviewees express concern that the sense of over-moderation of content, particularly in the case of HSPs’ own content moderation measures, may lead people from affected communities – and beyond – to abstain from exercising their rights online for fear of becoming persons of interest for counterterrorism authorities or having their profiles and channels of communication blocked by HSPs. Such a chilling effect can affect very large numbers of people and extend from freedom of expression and information to other rights, such as freedom of assembly and association. Reports by international organisations and bodies acknowledge the risk of a ‘chilling effect’ on rights in the context of counterterrorism measures.
Member States should ensure that competent authorities applying the regulation are adequately equipped to carry out their tasks in a manner fully consistent with the prohibition of discrimination. Appropriate guidance and training should be made available, ensuring that language or association with a particular religion does not play a disproportionate role in decisions on the terrorist nature of online content. In this context, Member States should consider regularly reviewing, as appropriate, the removal orders and referrals issued by their competent authorities to detect any risk of discrimination. The European Commission could assist Member States in this regard by issuing guidance supporting a harmonised approach.
Furthermore, the European Commission should, including in the context of enforcing other applicable EU legislation, take measures to ensure that HSPs effectively safeguard the right to non-discrimination while diligently performing their online content moderation obligations to counter online terrorism.
Articles 7 and 8 of the regulation stipulate transparency obligations for HSPs and competent authorities, respectively, including the publication of annual transparency reports. The transparent provision of information can enhance accountability and help identify the risk of over-removal of content, and it is essential for evaluating the regulation’s impact on fundamental rights.
FRA’s findings show that the information provided in these reports varies in quality and scope. The transparency obligation of Member States under Article 8 covers only basic information about removal orders and none about referrals, which limits the reports’ informative value. Transparency reports by HSPs lack granularity and data comparability across the industry, owing to issues such as the use of different definitions of terrorist content, the lack of reporting on referrals and the absence of disaggregation by categories such as region or language. As a result, transparency reports do not provide the information needed to detect potential risks to fundamental rights stemming from the application of the regulation and from broader HSP content moderation policies.
The Commission could consider strengthening the transparency requirements under the regulation with respect to competent authorities, ensuring in particular that the use of referrals is covered by the reporting obligation to reflect their important interplay with removal orders. This could be further supported by making publicly available statistical data about the use of PERCI, managed by Europol, for example as regards the types of terrorist content targeted by competent authorities.
The Commission could consider strengthening the transparency requirements under the regulation with respect to HSPs, ensuring sufficient granularity and comparability of reported data. This could entail using clear and harmonised categories among HSPs when it comes to what content is reported as terrorist content, distinguishing between content detected by HSPs and content flagged by authorities (referrals), and reporting data in a disaggregated manner (including separate data for the EU).
Article 52(1) of the Charter requires that limitations of fundamental rights be necessary and proportionate to the objectives pursued. While the regulation pursues the legitimate goal of addressing the dissemination of terrorist content online, interviewees express concern over its impact on fundamental rights, pointing to factors that limit its effectiveness in achieving this objective.
FRA’s findings show that a variety of factors result in competent authorities issuing removal orders to a limited number of HSPs, in a way that does not necessarily reflect the spread of terrorist content across the online environment. Some of these factors relate to the conduct of certain HSPs, such as a lack of cooperation with competent authorities or the failure of HSPs that offer services within the EU but are established elsewhere to designate legal representatives pursuant to Article 17. Other factors relate to the capacities of some competent authorities, including gaps in the mapping of HSPs in their jurisdiction and limited resources. This reduces effectiveness and at the same time places an undue burden on some HSPs. Furthermore, findings show that, despite awareness-raising and capacity-building efforts supported by the Commission, there is still limited awareness of the regulation, especially among smaller HSPs, and that HSPs’ efforts to comply with EU law in the field of online content focus predominantly on the more extensive due diligence and transparency reporting requirements of the DSA.
The fact that the vast majority of removal orders focus on jihadist content reflects that this type of terrorism is considered a key security threat in the EU, but – as interviewees noted – it also points to a possible imbalance relative to the amount of content related to other types of terrorism and the threat it poses, in particular right-wing terrorist content, which has been a growing concern for counterterrorism experts. In this context, interviewees report challenges in applying the regulation to right-wing content. Findings show that this phenomenon is not limited to competent authorities, as HSP content moderation frequently pays limited attention to right-wing terrorist content.
Tackling these gaps in effectively and comprehensively addressing the dissemination of terrorist content online is also important from the perspective of necessity and proportionality, considering the impact the application of the regulation can have on a variety of fundamental rights guaranteed by the Charter. This should be considered in conjunction with other concerns identified in the research and outlined in other key findings of this report, such as the risks stemming from the broad definition of terrorist content, the application of the one-hour rule, the threshold for designating HSPs as exposed to terrorist content after only two removal orders, and the potentially disproportionate impact on freedom of expression and other rights as a result of implementing specific measures.
The Commission should take stock of the application of the regulation so far with respect to different types of terrorism, such as jihadist and right-wing terrorism, and consider providing guidance on its applicability in this respect. It could also support Member States in the mapping of HSPs within their jurisdictions, to ensure that competent authorities are aware of relevant HSPs beyond those that are very well known. Member States should take steps to ensure that when detecting terrorist content and issuing removal orders, due attention is given to all relevant HSPs and to all types of terrorism, including, notably, by strengthening the focus on right-wing terrorist content. Europol could support competent authorities of the Member States in this regard. Member States should ensure that competent authorities are equipped for this purpose, including having the necessary human and financial resources.
The Commission, Member States and Europol should, within their respective mandates, consider measures to better enforce the obligation under Article 17 for all HSPs covered by the regulation that do not have their main establishment in the EU to designate a legal representative. The Commission and Member States could further promote awareness among HSPs of the regulation and the obligations stemming from it, including its interplay with other EU law that applies to HSPs in the area of regulating illegal content online, such as the DSA.
Article 47 of the Charter guarantees the right to an effective remedy. The regulation contains several mechanisms in this regard, including the possibility to challenge removal orders and other decisions in court (Article 9), and to have removal orders issued by competent authorities of another Member State scrutinised by an authority of the Member State where the HSP is established (Article 4, in conjunction with Article 12(1)(b)). It also requires HSPs to have in place complaint mechanisms for content providers (Article 10).
FRA’s findings reveal several limitations and concerns related to the effectiveness of these important safeguards. This includes insufficient clarity concerning when to scrutinise cross-border removal orders and which criteria to apply, along with the risk that scrutinising competent authorities overly rely on the expertise and assessment of the issuing competent authority, potentially rendering such scrutiny ineffective.
Concerning the possibility of seeking a judicial remedy against the removal of content, the research shows that incentives to do so may be low owing to the time-sensitive nature of online content and its loss of relevance, especially if it relates to current events. Furthermore, while HSPs may not be motivated to challenge the decisions of competent authorities, content providers might be deterred by the complexity of initiating proceedings in a Member State other than their own. In some cases, content providers might also not be properly informed in the first place about the removal of their content by the HSP (something that the regulation permits only where the competent authority prohibits such disclosure, for a limited time, for reasons of public security). In this regard, processes established in some Member States to provide a degree of oversight over the issuing of removal orders, such as involving external bodies in approving removal orders, can act as a safeguard, provided that the body carrying out such external oversight possesses sufficient expertise, capacity and independence.
The accessibility and effectiveness of complaint mechanisms set up by HSPs may likewise be limited by factors such as the inadequate provision of information to content providers about the removal of their content, or the employment of automated tools in dealing with complaints without effective human oversight.
Finally, given the gradual application of the regulation by Member States, these safeguards have so far been used to a limited degree and, in the case of the possibility to challenge removal orders in court, not at all. Bearing in mind the importance of access to an effective remedy for safeguarding all fundamental rights affected by the regulation, this makes it difficult to objectively assess the full impact of the regulation on fundamental rights and the functioning and effectiveness of its safeguards, as envisaged in Article 23. To this end, the findings in this report – based on interviews with experts – offer a good basis for mitigating fundamental rights concerns upstream.
All Member States should effectively implement Article 4, in conjunction with Article 12(1)(b), as a key safeguard under the regulation in the case of cross-border removal orders. They should provide clear guidance to competent authorities as regards the scrutiny of such removal orders, ensuring that it is conducted systematically and in a comprehensive and objective manner.
The Commission could consider enhancing the accessibility of remedies for content providers by making it obligatory for HSPs to inform content providers of the reasons for the removal and of their right to challenge the removal order, without the need for content providers to request such information. This is without prejudice to the exception for reasons of public security envisaged in Article 11(3).
Furthermore, Member States could consider steps to ensure effective oversight in the course of issuing removal orders. This would offer an additional safeguard, given that once content has been removed, the effectiveness of existing remedies appears to be limited in practice.
Finally, the Commission should consider conducting an evaluation of the regulation once all of its main elements have been used in practice to a sufficient degree. This is, in particular, the case for access to remedies pursuant to Article 9, which needs to be a central element of any evaluation of the regulation’s fundamental rights impact.
The regulation aims to contribute to the protection of public security by addressing the proliferation of terrorist content online in a manner that respects fundamental rights and contains a set of safeguards to this end.
As a horizontal finding common to several thematic findings in this report (see notably FRA opinions 2, 3, 5, 6 and 9), the research points to a number of areas where the application and interpretation of the regulation vary, both among competent authorities and among HSPs, with potential implications for its effective enforcement as well as for the level of protection of fundamental rights.
This report highlights a number of existing initiatives and practices at the EU and Member State levels that aim to support the application of the regulation in a manner that helps safeguard fundamental rights, including through the provision of additional guidance and training, exchange of experience and awareness-raising. Further enhancing these efforts would be an important step supporting a uniform application of the regulation in line with fundamental rights.
The Commission and Member States should, based on their respective spheres of competence, support the application of the regulation by providing appropriate guidance and training to the staff of competent authorities, as well as by enhancing the awareness about the regulation and its relevant provisions and applicable obligations among HSPs. These should be based on evidence indicating the main challenges in the application of the regulation, drawing upon relevant sources of expertise – including fundamental rights – and building upon existing initiatives where appropriate.