Internet companies should remove terrorist content quickly, at the latest within one hour of receiving an order from the authorities, the Civil Liberties Committee agreed.
By 35 votes to 1, with 8 abstentions, the Civil Liberties Committee on Monday approved draft legislation to tackle the misuse of hosting services for the public dissemination of terrorist content online across the EU. Hosting service providers that systematically and persistently fail to comply with the law may be sanctioned with fines of up to 4% of their global turnover.
No obligation to monitor or filter all content
Once an internet company that hosts user-uploaded content (such as Facebook or YouTube) and offers its services in the EU receives a removal order from the competent national authority, it will have one hour to remove the content. However, companies will not be generally obliged to monitor the information they transmit or store, nor to actively seek out facts indicating illegal activity.
If a company has been subject to a substantial number of removal orders, the authorities may request that it implement additional specific measures (e.g. regularly reporting to the authorities, or increasing human resources). The Civil Liberties Committee voted to exclude from these measures any obligation to monitor uploaded content or to use automated tools.
Moreover, any decision in this regard should take into account the size and economic capacity of the enterprise and “the freedom to receive and impart information and ideas in an open and democratic society”, MEPs agreed.
To help smaller platforms, MEPs decided that the competent authority should contact companies that have never received a removal order at least 12 hours before issuing the first order to remove content they are hosting, providing them with information on procedures and deadlines.
What is terrorist content?
The legislation targets any material (text, images, sound recordings or videos) that “incites or solicits the commission or contribution to the commission of terrorist offences, provides instructions for the commission of such offences or solicits the participation in activities of a terrorist group”, as well as content providing guidance on how to make and use explosives, firearms and other weapons for terrorist purposes.
Content disseminated for educational, journalistic or research purposes should be protected, according to MEPs. They also made clear that the expression of polemic or controversial views on sensitive political questions should not be considered terrorist content.
“There is clearly a problem with terrorist material circulating unchecked on the internet for too long. This propaganda can be linked to actual terrorist incidents and national authorities must be able to act decisively. Any new legislation must be practical and proportionate if we are to safeguard free speech. Without a fair process we risk the over-removal of content as businesses would understandably take a safety first approach to defend themselves. It also absolutely cannot lead to a general monitoring of content by the back door,” said rapporteur Daniel Dalton (ECR, UK).
“A disproportionately high number of shares and views of terrorist content happens within the first hours of the upload. We therefore made sure that this one-hour rule stays in the Report. The Left unfortunately removed important rules from the legislation, such as the possibility to refer content online by authorities to online platforms or the compulsory tracking and removing of terrorist posts by online platforms themselves. We will continue to seek for these mechanisms in the future. The fight against online propaganda, today’s most common root cause of radicalisation, is too important to be taken as a hostage of pre-election campaigning. We will not allow this to happen,” said the EPP Group’s Spokeswoman on the file Rachida Dati MEP.
The full Chamber will vote on the draft law next week. The new Parliament, formed after the European elections, will be in charge of negotiating with the Council of Ministers on the final form of the legislation.