Last updated on March 9, 2021
Apparently there will be no mandatory upload filter against terrorist content. Nevertheless, authorities will be able to order proactive measures.
The EU has agreed on new rules to combat terrorist content on the net. As the EU Commission announced in Brussels on Thursday, the new regulation not only obliges internet services to delete such content within one hour. They can also be forced to “proactively” counter abuse of their services. “Companies will decide on the choice of measures”, the Federal Ministry of the Interior announced. According to MEP Patrick Breyer (Pirate Party), an obligation to use upload filters has been prevented.
The exact wording of the agreement reached is not yet available. In addition, the agreement still has to be officially approved by the European Parliament and the member states.
However, one thing is certain: In future, public authorities will be able to issue removal orders throughout the EU. According to the Ministry of the Interior, national authorities will be authorised to “order the deletion of terrorist content, regardless of where the company is based in the EU”. The authorities of the country in which the company is based will be involved in this process. However, providers can also seek effective legal protection in their home country against a foreign order without major hurdles, said SPD member of parliament Petra Kammerevert.
Resistance to upload filters
The European Parliament had already accepted the statutory quick deletion in April 2019. However, MEPs had rejected the Commission’s proposal that providers should use “automated tools” to prevent the re-upload of illegal content.
According to Breyer, there is now “a clearly formulated waiver of the obligatory use of error-prone upload filters”. The algorithms, which cannot reliably distinguish terrorist propaganda from the legitimate use of pictures and videos, would, however, be widely used voluntarily.
According to Kammerevert, scepticism is still appropriate. “For example, journalistic or artistic content as well as polemical or satirical expressions of opinion are expressly excluded from the scope of application. (…) Nevertheless, the state may examine without cause whether contents actually serve these purposes or whether they are only used as a pretext to spread terrorist content, even without any valid evidence”, said Kammerevert. Therefore, the regulation, which is directly applicable law in the Member States, “definitely encroaches significantly on basic communication rights”.
Seehofer threatens companies
Federal Interior Minister Horst Seehofer (CSU) welcomed the agreement. “The spread of terror is not protected by freedom of opinion and the Internet is not a lawless space. This responsibility does not only fall on the state. We are also making sure that no Internet company can evade its responsibility”, Seehofer said. According to the EU Commission, member states can impose sanctions if companies do not meet their obligations.
Fines should, however, take into account the size of the enterprise and should be correspondingly lower for small, medium-sized and micro enterprises. According to Breyer, the trilogue negotiations agreed to “exclude fines for providers who, for technical or operational reasons, cannot comply with a removal order within one hour”.
The EU Commission had presented its controversial draft regulation in September 2018. The proposal largely corresponded to the non-binding recommendations proposed by the EU Commission in March 2018. However, Seehofer and his French counterpart Gérard Collomb had already called on the EU Commission in April 2018 to come up with a legal regulation.
At the beginning of August 2018, the Commission then announced that a rapid legislative proposal was planned. After the terrorist attack on two mosques in Christchurch, New Zealand, the question also arose of how to prevent the distribution of attack videos.
All platforms with user-generated content affected
The Regulation affects all Internet companies “providing services in the EU, regardless of their size and their global headquarters”. The draft itself refers to “hosting service providers” (Article 2). However, this does not only include webhosts or larger platforms such as Facebook, but all providers that store and make available information from “content providers”. A “content provider” is defined as any “user who has provided information that has been stored or is stored on his behalf by a hosting service provider”. This means that any provider hosting user-generated content would have to comply with the requirements of the law. “Terrorist content” includes, inter alia, “incitement to or advocacy of terrorist offences, including by glorification, with the consequent risk that such offences may be committed” or “encouragement to participate in terrorist offences”. It also includes “technical instructions or methods for committing terrorist offences”.