30 November 2021

On 30 November 2021, EIF organised a debate on ‘The limits of self-regulation in tackling harmful content’ to explore and assess the challenges of a self-regulatory approach when it comes to tackling harmful content online.
The debate, hosted by MEP Tiemo Wölken and moderated by EIF Director General Maria Rosa Gibellini, featured the following experts:

  • Marco Pancini, Government Affairs and Public Policy Director for Europe, YouTube
  • Lisa Ginsborg, Acting Secretary General, European Digital Media Observatory
  • Gina Nieri, member of the European Commission High-Level Expert Group on fake news, Mediaset
  • Carlos Hernández-Echevarría, Head of Public Policy & Institutional Development, Maldita.es
  • Daniel Braun, Deputy Head of Cabinet, European Commission Vice President Věra Jourová

According to Tiemo Wölken MEP, there is not enough discussion around how to tackle disinformation: harmful content often sits right at the border between what is legal and what is illegal. This makes any binding rules very sensitive, since fundamental rights such as freedom of expression must not be restricted.

So far, the main solution proposed has been the introduction of voluntary codes of conduct, but those instruments have their limits when it comes to enforcement and implementation across Europe. In MEP Wölken’s view, codes of conduct can be a powerful tool against disinformation, but they should never replace hard law, which is adopted through a democratic process. More action should be taken against organised disinformation outlets’ attempts to manipulate our democratic debate.
MEP Wölken welcomed the European Commission’s Media Freedom Act.

Marco Pancini stressed that YouTube wants to be part of the solution. Filtering out disinformation and misinformation is not something YouTube can do alone: this is why it works with partners in every country to develop new ways of countering fake news online.

The two pillars of YouTube’s strategy against misinformation are (1) stopping the spread of misinformation via content removal according to the community guidelines and (2) connecting people with quality information, increasing the good and decreasing the bad, raising up information from trusted sources and reducing the spread of videos with harmful misinformation. Collaboration is key to success in the fight for quality information.

Lisa Ginsborg seconded the view that the Code of Practice on Disinformation represented a natural and very important first step. However, a number of problems emerged in its implementation: a lack of common definitions, the need for clearer procedures and for more precise and comprehensive commitments, an absence of meaningful KPIs, and a lack of access to data and appropriate monitoring. Access to data is of particular concern for EDMO, together with the lack of independent oversight and auditing.

Ms. Ginsborg reiterated that many of the Code’s limitations stem precisely from its self-regulatory nature: the Commission’s guidance aims at evolving the Code towards a co-regulatory instrument. EDMO’s approach combines recognition of the importance of these self- and co-regulatory measures with broader approaches to user empowerment, including fact-checking, media literacy initiatives, greater transparency and accountability, and a better understanding of the disinformation phenomenon through access to data.

Gina Nieri, representing broadcasters, stated that the Code of Practice failed to deliver a meaningful result in terms of commitment and KPIs, but it was a step in the right direction, which must now be reinforced with binding obligations within the DSA. For broadcasters, the content distributed online and offline is strictly regulated, which is not the case for platforms.

In this context, broadcasters welcomed the DSA package and called on co-legislators to ensure a safe and pro-competitive online environment. Ms. Nieri thanked the European institutions for their work; broadcasters are ready to continue the exchange with the European Parliament and the European Commission to work in a direction that protects democratic values online and offline.

Carlos Hernández-Echevarría from Maldita, one of the biggest fact-checkers in Europe, gave some credit to the self-regulatory instrument; nevertheless, he agreed with the shortcomings the European Commission identified. The Code is a great first step, but self-regulation has inherent problems: it is voluntary, and it lacks proper KPIs, transparency and data access. The whole strengthening process should go in that direction.

For harmful content, the best response in the experience of fact-checkers is also to give users more information on how to recognise and tackle disinformation, so that they can critically understand why they have been targeted.

According to Daniel Braun, we need to rebalance the relationship between platforms and users and to make sure that the rules are defined in the public interest with democratic legitimacy, namely through legislation. The European Commission believes that codes should continue to play a role: the question is how to preserve the advantages of self-regulation while addressing its limits. For this reason the European Commission presented a digital rulebook consisting of the DSA, the DMA and the European Democracy Action Plan.

Moreover, said Mr. Braun, we need to accept that any single countermeasure will only solve part of the many issues. With hard law and effective multi-stakeholder cooperation to protect society, Mr. Braun believes we can reassert the integrity of the digital space to a large extent.
