Guidance for Regulation

UNESCO emphasizes the importance of a human rights-based approach to tackling hate speech and disinformation, including through safeguarding freedom of expression.

The guidance for regulating digital platforms aims to support freedom of expression and the availability of accurate and reliable information in the public sphere.

Read the information and questions below to learn more about the guidance.


Last update: 19 December 2022

The "Guidance for regulating digital platforms: a multistakeholder approach" is now publicly available on the conference website. We welcome your inputs and observations. The guidance aims to support freedom of expression and the availability of accurate and reliable information in the public sphere.

Consult the document here

The deadline to receive all your inputs is 16 January 2023. Comments should be sent via email to internetconference@unesco.org. Please include as the subject of your email: "Comments to guidance on Regulating Digital Platforms".

Follow the procedures below to submit your inputs:

  • Consult the guidance document on the conference website.
  • When making your inputs, please refer to the number of the paragraph you are commenting on. All paragraphs in the document are numbered.
  • Please provide as much detail as possible in your suggestions for consideration.
  • If you include any references, please provide the source and specify the information you are referring to. Given the high volume of submissions, please avoid sending links or attachments.

Language versions: French and Spanish versions of the guidance will be uploaded in the coming days.

Some approaches to regulation have (inadvertently or deliberately) led to suppressing freedom of expression or have simply proved ineffective in dealing with damaging content.

Many states have limited regulatory capacity, while others adopt approaches that are not aligned with international human rights standards.

Evidence about the prevalence of hate speech and disinformation on social media platforms remains incomplete, partly due to a lack of transparency and data access on the part of platforms.   

However, academic studies have shown that this online content is a major phenomenon with offline consequences for democracy and human rights.

  • Facebook reported that between January 2021 and March 2021, there was a 0.05% to 0.06% prevalence of hate speech, a slight decrease compared to its two previous reports (source: Facebook).
  • A study analyzing a dataset of 183 million Parler posts from 4 million users showed that the platform's lax moderation regime enabled conspiracy theories, violent extremist groups and coordination for the storming of the U.S. Capitol on January 6, 2021 (source: Aliapoulios, M. et al., 2021; Whittaker, J. et al., 2021).
  • During the 2019 European elections, 500 Facebook pages and groups promoting disinformation received 533 million views and were liked, commented upon or shared by 67 million people (source: Steiger, D., 2021; Martin-Rozumilowicz, B. and Kuzel, R., 2019).
  • In the context of elections, research has shown that "false or salacious information about women spreads further, faster and more intensely than disinformation about men" (source: IFES).
  • 85,247 videos that violated YouTube's hate speech policy were removed between January and March 2021 (source: UNESCO).

What is the focus of this guidance on regulation?

The "Guidance for regulating digital platforms: a multistakeholder approach"

  • Will focus on structures and processes to help protect users from content that damages democracy and human rights, while at the same time respecting freedom of expression;
  • Does not deal with data privacy, competition, intellectual property or other such established legal rights, all of which require different approaches and different legal or regulatory frameworks (including being subject to international treaties or conventions).

What happens after the conference?

The next steps and follow-up will be discussed at the Global Conference.

The "Guidance for regulating digital platforms: a multistakeholder approach" will support Member States and digital platforms that wish to review their regulation in order to adapt to pressing challenges. The Roadmap and follow up of the guidance should be considered a relevant input for other processes both global (such as the development of the Global Digital Compact or the WSIS+20 Review) or national (on-going regulatory reforms).   

 

What is the guidance's approach?

The guidance for regulating digital platforms will:

  • Be based on principles – with the regulators setting the goals and processes, and the digital platforms fulfilling them;
  • Specify high-level issues that the companies must address;
  • Spell out some principles for regulators’ autonomy and independence;
  • Have a gender-sensitive and intersectional approach.

What high-level issues will the guidance require the platforms to report on?

The current draft of the guidance sets out 10 areas* in which platforms would be required to report on their processes:

  • transparency requirements;
  • processes for managing content;
  • how they create an enabling environment for users;
  • user reporting mechanisms;
  • how they deal with harmful content that threatens democracy and human rights;
  • how they promote media and information literacy;
  • how they help ensure election integrity;
  • what risk assessment processes they have in place for major events;
  • how they provide for different languages and accessibility online;
  • what data access they provide to researchers.
     

*The determination of these issues is under discussion as part of the consultations.

What is the goal of the guidance for regulation?

  • The guidance will:
    • Provide guidance in developing regulation that can help Member States manage content that damages democracy and human rights while supporting freedom of expression, information and other human rights;
    • Assess whether government regulatory systems align with international human rights standards and hold governments to account for regulatory overreach;
    • Provide real accountability for digital platforms and 'big tech';
    • Serve as a tool for civil society and the wider global community for holding governments (regulators, parliaments) and companies accountable for their commitments, and for advocating for a regulatory system that aims to support freedom of expression and the availability of accurate and reliable information in the public sphere;
    • Promote more nuanced approaches as alternatives to States directly managing content online (which can lead to undue restrictions), while also limiting unchecked power by private actors.