The European Court of Justice has ruled that individual member states can order Facebook and other platforms to remove illegal content, along with copies of that content, and to limit access to it worldwide.

How did the court case originate? The European Court of Justice came to its decision after Austrian politician Eva Glawischnig-Piesczek requested that Facebook remove defamatory comments about her, which an Austrian court had already found to be illegal, and limit access to them globally. She also requested that comments identical or equivalent to the original be removed. The request was filed with Facebook Ireland, Facebook's EU headquarters, in 2016. Facebook refused, and a court battle ensued.

What does the EU law say? Under the EU's Electronic Commerce Directive, platforms such as Facebook are not liable for illegal content as long as they have no knowledge of it, or provided they remove or disable access to it as soon as they become aware of it. The directive also does not require platforms like Facebook to proactively monitor their services for illegal content.

What did the Court rule? The ECJ has now ruled that platforms can be ordered to:

  • Remove illegal content and limit access to it worldwide “within the framework of the relevant international law”
  • Remove or block access to content that is identical or equivalent to content originally found to be illegal. Equivalent content means content that remains “essentially unchanged” and thus “diverges very little from the content which gave rise to the finding of illegality”.
  • Do the above without carrying out an independent assessment of whether equivalent content is illegal. This, the ECJ ruled, ensures that the obligation on platforms does not increase, since they only have to look for identical or equivalent content, and they can, or may have to, deploy automated tools such as machine learning and AI to achieve this.

In the ruling, the ECJ states that EU law strikes a balance: a person’s reputation or honour is protected, but without imposing an excessive obligation on host providers.

. . . On the other hand, that protection is not provided by means of an excessive obligation being imposed on the host provider, in so far as the monitoring of and search for information which it requires are limited to information containing the elements specified in the injunction, and its defamatory content of an equivalent nature does not require the host provider to carry out an independent assessment, since the latter has recourse to automated search tools and technologies.

Thus, such an injunction specifically does not impose on the host provider an obligation to monitor generally the information which it stores, or a general obligation actively to seek facts or circumstances indicating illegal activity, as provided for in Article 15(1) of Directive 2000/31.

The ECJ’s ruling on the Right to be Forgotten last week is in direct conflict with this ruling. The top court had ruled that Google does not need to apply Europe’s right to be forgotten globally, in order to balance the rights to privacy and personal data protection against internet users’ freedom of information. It had also noted that numerous third states either do not recognise the right to de-referencing or take a different approach to it. The court acknowledged that information harmful to a person’s reputation “is likely to have immediate and substantial effects on that person within the EU itself”, and that a “global de-referencing would meet the objective of protection referred to in EU law in full”, but nonetheless held that EU law does not currently require it.