Should a large-scale social media platform decide to block legal content? Should a platform with more than three billion users be able to restrict access to protected speech, to decide that criticism of Covid-19 policy is misinformation, or that euthanasia equals suicide – on a worldwide scale? The answers depend on one’s view of social media platforms and the legal system (public, private, or a combination of the two) one sees fit to apply to the realm of content moderation.
by Berdien van der Donk
When asked, users of social media platforms do not know whether their interactions on the platform are public or private, as shown by an empirical study (n=1699) in 2020. Can platforms be compared to anything in the physical world around us? Directly after it became public that he had offered $44 billion to acquire Twitter, Elon Musk tweeted that “Twitter is the digital town square where matters vital to the future of humanity are debated”. Three years earlier, in March 2019, Mark Zuckerberg reached that same conclusion, stating that “over the last 15 years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square.”
This ‘town square’ argument is not solely reserved for multibillionaire owners of large-scale social media platforms. In an early proposal for the Digital Services Act package, the European Commission similarly stated that these platforms have “become de facto ‘public spaces’ in the online world”, omitting any further argumentation as to why they would qualify as such.
However, social media platforms are private companies and thus – by nature – not an extension of any state. This renders the discussion on the qualification of a social media platform even more difficult, because by default, a platform does not constitute a purely public space, such as many traditional town squares in Europe. Private companies are – in principle – free to agree on the terms of service they want to apply. According to this point of view, social media platforms would be free to exclude whichever content they want, including content that falls within the scope of protected speech.
This interpretation of the freedom of contract is, however, incorrect. The freedom of contract does not provide a free pass to moderate content, at least not in the European Union. The freedom of contract is included in the freedom to conduct a business in article 16 of the Charter of Fundamental Rights of the European Union. This freedom is not an absolute right. A previous study by the author into the scope of this article has shown that social media platforms can be (severely) limited in the freedom to draft user terms, for example to protect society, the internal market, and the rights and freedoms of others. For the dissemination of illegal content, platforms can be held liable, impairing their “free choice” to provide access to such content.
European discrepancy on the role of social media platforms
The confusion on the qualification of social media platforms can also be observed in recent case-law in the Member States of the European Union. In a proposal for a law regulating social media (lov om regulering af sociale medier), the Danish legislator seemingly qualified social media platforms as public fora, pointing to the double role social media platforms play in Danish society as, on the one hand, interactive communication platforms, and on the other hand, a source of information. Unfortunately, neither the law nor the preparatory works mention the origin of this argument or substantiate the reasoning for this normative stance.
In the Netherlands, national case-law shows definitions varying between private property and a “continuous agreement”. In the former, the District Court of Amsterdam compares social media platforms to a shopping mall – a privately owned, publicly accessible place – without a proper justification as to why the Court has chosen to do so. The Dutch legislator has not taken an active stance on defining social media platforms, and thus the current legal landscape remains unclear.
The German case-law takes a clearer, though no less normative, approach. In July 2021, the Bundesgerichtshof (BGH) concluded in two cases (III ZR 192/20 and III ZR 179/20) that a social media platform’s service does not equal the communication function of the State, and thus does not face the same requirements to safeguard fundamental rights on the platform. Social media platforms cannot be compared to a state monopoly, nor to a privatized equivalent. The BGH highlights that large-scale social media platforms offer a significant communication possibility on the internet, but they do not guarantee access to the internet as such. In the end, social media companies are private companies that can choose to open their platform to the general public.
In Italy, the Court of Appeal of L’Aquila concluded that “social media platforms are a non-essential service” and that the scope of permissible speech is defined by the terms of service. In its assessment of the service offered by Facebook, the Court of Appeal concluded that Facebook does not offer a “general” expression platform: it offers a service that allows the exchange of expression that does not harm other users. Thus, it was justified to remove content that would endanger the safety and well-being of other users or the integrity of the values of the Community. Earlier, the Court of Rome had likewise defined the relationship between a platform and a user as a purely private agreement.
The necessity to find a qualification for social media platforms
Seemingly, there is no consensus on a qualification for social media platforms. This makes defining the rights of users on these platforms difficult, because which legal order should prevail: user terms, national private law, or fundamental rights? Defining the role of social media platforms is necessary for the interaction of these plural legal orders. If platforms are solely private companies contracting with another private party, then the (in)direct application of fundamental rights seems out of place outside the scope of traditional open norms in private law. After all, agreements between two private parties are a matter governed by private law. Conversely, resolving content restrictions on platforms qualified as “public fora” calls for the protection of the fundamental rights of the participants in the public forum, at least indirectly.
It could be that these online communication spaces constitute something we have not yet observed in the physical world and therefore call for a qualification of their own. However, before reaching that conclusion, other comparisons to existing concepts must be ruled out. Recalling Easterbrook’s call to prevent a “law of the horse”, taking a normative stance on a qualification of social media platforms will avoid any unnecessary “laws of the social media platform”. Such fragmentation – on a worldwide scale – would create a complicated legal framework that can potentially only be navigated and complied with by large-scale platforms.
There is not one type of social media platform
Whereas few will intuitively oppose a decision of an adult sexual platform to remove cat videos, many will have a subconscious unease or concern with the moderation of this exact type of content on large-scale social media platforms like Twitter, Facebook, or TikTok. After all, cat content rules these more general social media platforms. The question, then, is why certain types of online platforms have more leeway to moderate speech than others.
The Italian case-law shows that even a broadly scoped expression platform like Facebook is, in fact, not an open and general platform. Facebook only allows speech upholding the “social co-existence” within the Community. Platforms can decide to be “themed”, but will have to communicate clearly to their users what is allowed within the scope of the theme. The leeway to moderate more speech than others (read: to remove cat videos) therefore boils down to the definition of the service in the user terms of the platform. The latter seems to pose problems for those platforms that offer their service as “general exchanges of opinions and ideas”.