The end of an era for self-regulation? EDAP, political advertising and the role of social media in Europe


The events at the US Capitol in January 2021 may well signal a sea change in the regulatory scrutiny of social media platforms worldwide. Although several prominent platforms had already moved to restrict political content, many critics blamed social media for stoking dissension. This, in turn, has renewed regulatory concern in the EU about the impact of misinformation, paid political advertising and other forms of online political activity on democratic integrity.

Upcoming elections in Europe will provide another test of political attitudes to the role of platforms in such activity. As French and German authorities prepare for national elections in 2021, platforms’ political content policies will come under close regulatory scrutiny, as authorities assess their impact on electoral processes and the potential need for more stringent obligations. That scrutiny is likely to be sharpened by widespread allegations of interference in past European elections, such as the 2017 French presidential election, and coloured by a prevailing sentiment within Europe that self-regulation is no longer sufficient.

A significant component of the EU’s stance will be the European Democracy Action Plan (EDAP), a combination of legislative and non-legislative measures that seeks to safeguard EU electoral processes from ‘malign interference…foreign or domestic’, with a particular focus on political advertising. The European Commission has also launched a public consultation on political advertising, open until April 2021, inviting views on issues including the regulation of political advertising, targeting and transparency requirements.

Some social media platforms have tried to head off regulation with proactive measures. TikTok was the first major platform to ban political advertising, announcing in October 2019 that such content was ‘not something [that]…fits the TikTok platform experience’. Several weeks later, Twitter followed suit, prohibiting paid political content and any advertising alluding to political actors, processes or support.

Facebook, Instagram and Google have restricted paid political content, but such advertisements remain conditionally permitted. Instagram and its parent company, Facebook, for instance, employ an ‘ad authorisation process’ requiring content to carry a ‘Paid for by’ disclaimer. In January 2021, Privacy International, a UK-based digital rights charity, published an open letter calling on Facebook and Google to ‘accelerate the expansion of full transparency tools’ and to adopt an internationally consistent approach to paid political content.

The prevailing political view in Europe is that these measures fall short. We can therefore expect legislation regulating political advertising in the coming months, along with an overhaul of the existing voluntary Code of Practice on Disinformation into a more robust co-regulatory framework of obligations and platform accountability, including the establishment of an implementation monitoring system in spring 2021.

Although the EDAP represents the most obvious vehicle for reforming online political advertising, MEPs may also turn to the concurrent Digital Services Act (DSA) to strengthen the regulation of online political activity. These instruments will also address other online advertising techniques, including microtargeting. If this year’s elections are seen as a failed test of platforms’ ability to keep perceived misinformation out of political advertising, European policymakers can be expected to tighten the regulatory framework further. Legislators are increasingly concluding that platforms must be held accountable for the content they help to propagate. More broadly, the narrative that platforms bear an element of editorial responsibility for published content will feed into the ongoing debate over whether social media companies are platforms or publishers, which could serve as a prelude to deeper obligations on user-generated content more generally.


You can listen to the GC team discussing antitrust law and the regulation of Big Tech on a recent episode of the Global Counsel Podcast.


The views expressed in this note can be attributed to the named author(s) only.