Digital constitutionalism to solve the legitimacy crisis of content moderation

Niclas Johann

June 2022

When I first heard about YouTube, Twitter and co. kicking Russian state media off their platforms in response to disinformation about the Russian invasion of Ukraine, I felt substantial unease. Not because I thought they did not deserve it but because it was yet another example of how online platforms have near-absolute discretion in their content moderation decisions. This untamed power has caused a legitimacy crisis. But digital constitutionalism can help to solve it.

There is no doubt that social media platforms have taken a central role in shaping the content people around the world consume day to day. This includes news about recent events. Online platforms are in an unprecedented position of power – for private actors – to dictate what gets seen and what does not.


My platform, my rules

Now, as we all know, with great power comes great responsibility. But that apparently does not apply to social media companies. Through careful narrative crafting they have managed to largely hold on to the liability exemptions granted by lawmakers in the early days of the internet. With the exception of some illegal content, for instance material from terrorist organisations, they can freely dictate in their terms and conditions which speech is allowed and which is not. For example, Facebook unilaterally decided that death threats against Russian soldiers would no longer amount to a violation of its rules.

When and how social media companies choose to enforce their own rules is also essentially up to them. Except for the occasional public outcry and some relentless civil society organisations, platforms do not face much external scrutiny with regard to their content moderation.  

This lack of accountability and legitimacy has substantially eroded the public’s trust in these online platforms. Social media companies have (wrongly) been accused of bias in enforcing their terms and conditions against conservative politicians. On other occasions, prominent political figures have gone unpunished for violations because platforms feared the repercussions. Despite repeated violations of Facebook’s terms of service throughout his presidency, Donald Trump was only banned after his election defeat and the storming of the Capitol.

If left unaddressed, this lack of legitimacy in content moderation practices could further deepen the mistrust in already-divided societies. Much of our political discourse now happens on social media. That is why it is essential that platforms support rather than hinder democratic processes through fair content moderation and respect for the rule of law.

So, what can be done? Looking back in history to the last time people grew discontented with the unregulated concentration of power in the hands of a few kings and queens can provide a starting point. It was the birth of constitutionalism.

The ideas of consent of the governed, equal and predictable enforcement of laws, and due process may predate smartphones and social media by a couple of hundred years. But they can still help guide our discussions about how to legitimise content moderation practices in our digital age.

Constitutional ideas in the digital age

Digital constitutionalism is a concept that aims to adopt constitutional standards for the digital world. It thus provides us with a set of principles and values to guide our responses to the challenges of digital technologies. In content moderation online, we need to ensure that the principles put into practice offline can be enforced online. This includes transparency about how rules are created, fair and equal processes for how takedown decisions are reached and the option for users to challenge a decision they disagree with before an independent body.   

Due process is essential to justify content moderation decisions to people even when they disagree with the final outcome.

Laying out these procedures in advance is all the more important in situations when platforms need to call the shots fast – for example in times of conflict. We cannot spend ages deliberating the best approach when lives are at risk. Constitutionalising content moderation would give platforms a framework to rely on and to justify their decisions to the public.

Simultaneously, it would ensure that the fundamental rights and freedoms of users are not unduly infringed in response to a conflict. Decisions such as the deplatforming of Russia Today need to be accompanied by an impact assessment on the right to freedom of information. Allowing death threats against Russian soldiers might not be justifiable under this approach.

Critics might say that online platforms would never voluntarily limit their power over their ‘kingdoms’ so significantly. However, legislative initiatives such as the Digital Services Act show that governments can mandate the constitutionalisation of content moderation. And importantly, they can do so without simply handing power from private companies to state actors.  

A genuine democratisation of the digital kingdoms is possible. Digital constitutionalism should be the starting point for all future social media regulation and should be used to re-evaluate existing policies. The EU should take a global leadership role in this regard. It has the power to inspire regulators in other countries and champion democratic values in our digital society.

Just as it did some hundred years ago, change will not happen overnight. But if successful, it would rebuild trust in central aspects of our digital lives. Even if people do not agree with a particular decision by Facebook and co., they could at least feel at ease knowing that it was reached through a legitimate process.
