Leading NGOs believe the proposed rules would encourage Internet giants to restrict free speech and harm “marginalized religious communities.”
by Daniela Bovolenta
It seems that new laws are continuously proposed these days under the banner of preventing terrorism. Combating terrorist organizations is a laudable enterprise, but the danger is passing rules that would restrict the freedom of organizations that have nothing to do with terrorism.
This is the opinion expressed in a letter from more than 60 human rights watchdogs and associations of journalists, including Human Rights Watch and European Digital Rights (itself a coalition of 44 NGOs), about the final draft of the European Union Regulation on Preventing the Dissemination of Terrorist Content Online. The text is expected to be voted on by the European Parliament on April 28.
The letter urges members of the European Parliament to reject the draft. It argues that the regulation would encourage Internet giants such as Facebook, Twitter, and Google to be even more draconian than they already are in eliminating materials that may get them into trouble, without even explaining to users why the materials have been removed.
The letter raises three specific concerns. The first is that the draft regulation requires Internet providers to remove objectionable content “within one hour” of being notified. Since this is impossible, or very difficult, what will happen, the letter says, is that Internet giants will be encouraged to use, even more than they already do, upload filters and other automated tools to prevent publication of content some may regard as “terrorist.” These automated tools, as we all know, are never totally “intelligent.” To be on the safe side, social media and search engines would probably base their filters on a broad definition of terrorism.
“Because it is impossible for automated tools to consistently differentiate activism, counter-speech, and satire about terrorism from content considered terrorism itself,” the letter states, “increased automation will ultimately result in the removal of legal content like news content and content about discriminatory treatment of minorities and underrepresented groups.”
Second, each EU member state may designate a “competent authority” that may order content removed within one hour. True, this means member states of the European Union rather than China or Russia. But while the letter is concerned about authoritarian trends in some EU member states, we all know that totalitarian regimes have their “friends” in the European Union, who may share their designation as “terrorist” of organizations that others see as fighting for freedom.
Third, each EU member state can issue a cross-border order that would affect the whole European Union, if not the whole world, since it may be less expensive for Internet giants to delete the contested content altogether rather than block access to it in the European Union only.
Interestingly, Chloé Berthélémy, policy advisor at European Digital Rights, mentioned in an interview “marginalized religious minorities” as potential victims of measures that “will be easily manipulated for political censorship by unscrupulous governments.” Although Ms. Berthélémy offered Muslims as an example, groups labeled as “cults,” or simply disliked by certain governments, may be equally at risk.
Update: On April 29, 2021, despite the objections by NGOs, the regulation was passed by the European Parliament.