The European Commission’s knee-jerk response to online content governance during the Hamas-Israel war may jeopardise human rights. The Commission’s recommendation, which proposes a temporary incident response mechanism as part of the Digital Services Act (DSA) framework, is a fast-tracked “solution” to the spread of illegal content online that could have long-term consequences for freedom of speech. Access Now will be vigilantly monitoring enforcement of the recommendation to ensure the highest standard of protection for everyone’s fundamental rights.
“Adequate response to the spread of illegal content online must always be — especially during war and crises —
proportionate and meet the criteria of due process,” said Eliška Pírková, Global Freedom of Expression Lead at Access Now. “Anti-terrorist measures often carry inherent racist bias against historically oppressed groups and work to silence
those standing up against ongoing human rights abuse. The fight against terrorism isn’t free rein for authorities to
quash our fundamental rights.”
The proposed incident response mechanism would remain in place until the DSA becomes fully applicable in February 2024, and calls for better coordination between Member States and the Commission to avoid a fragmented response to the amplification of illegal content online. The Commission also requests the creation of an informal network of national public authorities to temporarily fill the gap left by the current lack of an institutional framework responsible for DSA enforcement at the national level.
Access Now recognises the urgent need for effective cooperation to adequately tackle the spread of illegal content, particularly when people’s safety and security are threatened, but, as it stands, the recommendation contains proposals that could trigger human rights violations.
The text calls on Member States to promptly issue removal orders against Very Large Online Platforms (VLOPs), including Instagram and YouTube, relying on the cross-border mechanism in the EU Regulation addressing the dissemination of terrorist content online (TERREG). TERREG has been widely criticised by human rights experts because, among other red flags, it allows any national competent authority to order the removal, within one hour, of online content hosted anywhere in the EU.
Pressing Member States to issue removal orders risks enforcement overreach: it opens the door to cross-border and even worldwide removal orders, and encourages private actors to preemptively block legal content to avoid any threat of liability. At the same time, the text reminds private actors of their voluntary commitments under the EU Code of Conduct on countering illegal hate speech online (EU Code of Conduct), emphasising its 24-hour deadline for content removals and thereby creating space for online censorship.
There are numerous long-standing reports demonstrating that Palestinian content, and Arabic content at large, is regularly censored by VLOPs due to arbitrary and erroneous overenforcement of anti-terrorism policies.
“Freedom of expression should not be another casualty of war,” said Marwa Fatafta, MENA Policy and Advocacy Manager at Access Now. “Platforms’ anti-terrorism content moderation policies have been the culprit behind the systematic censorship of journalists, human rights activists, and oppressed communities in Palestine and across the MENA region. The EU must be wary of its actions and their spillover effects, which could undermine people’s fundamental rights beyond its borders.”
The text also reveals that authorities are revising the EU Code of Conduct to introduce commitments to anticipate “threats of waves of illegal hate speech before content has gone viral online,” without publicly disclosing the process. This risks undermining the accountability, transparency, and proper stakeholder engagement that have surrounded the Code since its launch in 2016.
The European Commission must ensure that implementation of the recommendation does not enable discrimination against historically oppressed groups or people in at-risk situations. Access Now urges the Commission and national regulators to uphold the essence of the DSA and minimise biases against people in vulnerable situations.