The controversial EU regulation on terrorist content online is making its way through the European Parliament close to the end of the mandate – traditionally a time when MEPs face extreme pressure to pass laws that would otherwise have received more time and attention.

On Monday, April 8, the Civil Liberties, Justice and Home Affairs (LIBE) Committee adopted its position on the regulation, which I last reported on in February. While the Committee made some important improvements, the plenary will have to defend them against attempts by the largest group in the European Parliament, the conservative EPP, to undo those changes in the upcoming vote on Wednesday, April 17.

Improvement: No forced upload filters

We managed to push back on upload filters, which are included in the Commission proposal in Article 6. The text adopted by the LIBE Committee makes explicit that the state authorities who can order platforms to remove material they consider terrorist content cannot oblige web hosts to monitor uploads, nor require them to use automated tools.

Instead, the text calls for “specific measures” that hosts can take in order to protect their services (see Article 6). These measures can range from increasing staffing to exchanging best practices (see Recital 16). But regardless of the measure chosen, hosts must pay particular attention to users’ fundamental rights. This clarification is a major victory, considering that the introduction of upload filters seems to be the main objective of the European Commission proposal.

The EPP is against this change and has tabled amendments that would re-introduce the possibility for authorities to force platforms to use upload filters. The plenary must reject the EPP’s pro-upload-filter Amendment 175 and adopt the LIBE position against upload filters (Amendments 84 to 89)!

Improvement: No arbitrary standards for deletion

In addition to having to act on removal orders, platforms were to receive “referrals” of content that may or may not be terrorist content, which they could then voluntarily assess not by standards of law, but by their self-imposed, arbitrary terms of service (see Article 5). Rightly, the LIBE Committee realised that this would set a dangerous precedent for the privatisation of law enforcement and deleted the provision. While platforms will still undoubtedly make mistakes when removing content, they will at least have to judge by the definition of illegal terrorist content the EU set two years ago.

The EPP group has tabled amendments to try to re-introduce Article 5. On Wednesday, we will have to make sure that Article 5 stays deleted!

Filters through the back door?

Unfortunately, the unreasonable Commission proposal that illegal terrorist content must be taken down within one hour remains the default in the report adopted by the LIBE Committee (see Article 4). The only exception to this rule is for the very first time a website owner receives a removal order from an authority, in which case they get 12 hours to familiarise themselves with the procedure and applicable deadlines. Afterwards, regardless of platform size or resources, they must react within one hour in order to avoid harsh penalties. These penalties may amount to 4% of a platform’s turnover in case of persistent infringements (see Article 18).

A one-hour deadline is completely unworkable for platforms run by individuals or small providers, who lack the capacity to hire staff to handle potential removal orders 24/7. No private website owner can be expected to stay reachable overnight and during weekends in the unlikely event that somebody uploads terrorist material. Their only realistic option would be to automate removals: terrorism filters through the back door.

Blindly deleting or blocking flagged content without review is bound to lead to the deletion of legal uploads, such as a news report on terrorism that shows footage from a war zone, which may indeed be illegal in another context. Already today, there are plenty of examples of overzealous administrative authorities flagging perfectly legal material as terrorist content (note: contrary to what the title of that post suggests, these notices didn’t come from an EU agency, but from a French national authority). Thus it’s imperative that websites of all sizes have the necessary time to review reports.

A joint attempt by the Greens/EFA and GUE groups to give providers more time to react was rejected by the LIBE Committee. Amendments by the Greens/EFA, GUE and S&D groups will be put to the vote once more on Wednesday to try to get rid of the unworkable one-hour deadline.

What’s next and what you can do

The European Parliament will vote on its position on the terrorism regulation this week, on Wednesday, April 17. After this position is adopted, the next European Parliament will start trilogue negotiations with the Council, which has already rubber-stamped the Commission proposal, including mandatory upload filters.

Call your MEPs (consult the list here) and ask them to:

  1. Support the amendments proposed by Greens/EFA (Amendment 157), S&D (Amendment 160) and GUE (Amendment 164) to delete the strict one-hour deadline.
  2. Reject amendments that would undo the wins for a free internet by re-introducing upload filters or referrals!

Due to the incredible pressure to pass this legislation before the European Parliament election, not even a debate on the proposal is planned. Time pressure is no reason to pass a law that has fundamental flaws and would be a threat to our fundamental rights online!

To the extent possible under law, the creator has waived all copyright and related or neighboring rights to this work.

One comment

  1. Henry Crawford

    If:
    These penalties may amount to 4% of a platform’s turnover

    Then surely a small or private website owner may find it cheaper to simply pay the fine, rather than even bother to try to comply with the law?