Answers to the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression in regard to the Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act), provided by the Federal Government of Germany

Reference: Letter, dated 1 June 2017

The concerns raised in the communication refer to the draft law on the Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act) of 14 March 2017.

First of all, the Federal Government would like to set out the reasons why this draft was deemed necessary. In 2015, the spread of hate crime on the internet (especially on social networks such as Facebook, YouTube and Twitter) became increasingly serious. Not only were hate speech, defamation and malicious gossip an issue, but also the spread of “fake news”.

Freedom of expression and press freedom are the cornerstones of an open and free society. Nevertheless, freedom of expression ends at the point where criminal law begins. Criminally punishable hate speech and lies should have as little place in social media as in every other area of society. The internet shapes the culture of debate and the overall social climate in our country. What is more, verbal radicalisation is often the first step towards physical violence. Anyone who disseminates punishable content on the internet must be held to account – and not just by the justice system. The platform operators also bear responsibility.

Efforts have long been made to raise awareness of these issues amongst platform operators, with very little success. Voluntary commitments undertaken by the companies have only led to some initial improvements. But more is required. Criminally punishable content is still not being deleted in sufficient quantities following complaints filed by concerned individuals. Therefore, the Federal Government has concluded that legislative action is necessary.

During the legislative process, the draft has been thoroughly debated. At the committee stage, a public hearing with several independent experts took place. As a result, the draft has undergone a number of specific changes which minimize the risk of overblocking, thus safeguarding freedom of speech. The German Parliament approved a final version on 30 June 2017, which passed the Bundesrat on 7 July 2017 and will now be enacted.

In comparison to the initial draft, the law has already undergone several changes.

- Art. 1 para 1 now clarifies that platforms intended for individual communication (messaging services) or used for the dissemination of specific content are not covered by the definition of “social network”.

- The obligation to report introduced by Art. 2 only applies to platform operators who receive more than 100 complaints per year; reports are now required semi-annually rather than quarterly.

- In Art. 3 para 2 No. 3 the rigid deadline of seven days for removing illegal content has been removed. Instead, the operator is required to remove such content without delay, and in general within seven days. The timeframe of seven days may be exceeded if the legality of the content in question depends on the veracity of a statement of facts or on other factual circumstances; in these cases the operator may give the user in question the opportunity to comment.

- Most importantly, the deadline of seven days does not apply if, within the same period, the operator forwards the complaint to a self-regulatory body set up under Art. 3 para 6. This self-regulatory body will have to be independent, competent and provided with appropriate rules of procedure, including a complaints mechanism to guarantee the reinstatement of content that has been removed erroneously – without all the costs and burdens that would come with litigation in the courts.

- The concerns voiced about access to subscriber data being granted without prior court approval have been met by introducing the requirement of a court decision prior to the surrender of personal data.

These changes address many of the concerns that have been voiced. Since the German Constitution and the European Convention on Human Rights protect freedom of expression, there will also be effective remedies for anyone wishing to challenge any actions based on the new law.

With regard to the concerns about the vagueness of some definitions, the Federal Government points to the fact that Art. 1 para 3 defines illegal content exclusively by reference to provisions of the German Criminal Code. In the context of criminal law, the German Constitution and the European Convention on Human Rights alike require that any definition must be sufficiently clear, accessible and foreseeable. By referring to the provisions of the German Criminal Code, the law incorporates these restrictions. Only content that falls within the ambit of the offences defined in the Criminal Code can be considered illegal under the law.

In addition, the Federal Government wishes to make the following points in response to the communication:

The duty of platform operators to remove illegal content, as well as the States’ duty to protect victims of hate speech and criminal attacks on the Internet by enforcing this duty, is fully consistent with international law. It is well understood that platforms do not operate outside the law. Legislation enforcing respect for the law with regard to privacy, anti-discrimination and protection against crime should not be equated with censorship. The same rationale must nowadays apply to social networks.

It should also be understood that these duties are already incumbent on platform operators. The new law regulates their obligations with regard to handling complaints about illegal content. In this respect, the proportionality of the fines for non-compliance does not, in the Federal Government’s view, pose any problem. It should be understood that fines will not apply with regard to individual posts but only where a provider fails either to properly organize a compliance system or to fulfil its reporting or supervisory obligations. Also, such fines are to be determined according to the individual case and are subject to judicial review (which includes a proportionality test). It should be noted in this context that the quarterly profits of the platform operators in question are exceedingly high and have reached figures beyond 3 billion dollars. The upper limit of the fines provided for in the law has to take this into account. The fines will only serve their purpose if their assessment can take the economic potential of the offender into account.

The Federal Government recognizes the discussions about anonymity on the internet. While anonymity can be important for freedom of expression, the Report quoted (A/HRC/29/32) recognizes in its para 47 that States have the right to impose limitations on the right to anonymous expression where necessary to achieve legitimate objectives, such as the fight against hate speech and criminal content on the Internet. Likewise, the European Court of Human Rights has made abundantly clear in its case-law that hate speech is intolerable in a democratic society.