Freedom of Expression Online

The resources in this Module focus on some of the complex issues related to the digital exercise of freedom of expression. The Internet, social media, and search engines have profoundly transformed how people express themselves, access information, and communicate. The selected readings highlight the mismatch between these practices and the law trying to catch up with technological advances, while seeking to make sense of the resulting normative cacophony.

Content Regulation and Censorship

Author: UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (David Kaye)

“In the first-ever UN report that examines the regulation of user-generated online content, the Special Rapporteur examines the role of States and social media companies in providing an enabling environment for freedom of expression and access to information online. In the face of contemporary threats such as “fake news” and disinformation and online extremism, the Special Rapporteur urges States to reconsider speech-based restrictions and adopt smart regulation targeted at enabling the public to make choices about how and whether to engage in online fora. The Special Rapporteur also conducts an in-depth investigation of how Internet Companies moderate content on major social media platforms, and argues that human rights law gives companies the tools to articulate their positions in ways that respect democratic norms and counter authoritarian demands.”

UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye. Report on Content Regulation. A/HRC/38/35. April 2018.

Author: UN Special Rapporteur David Kaye

In the report (A/HRC/38/35) on the regulation of user-generated online content, "the Special Rapporteur examines the role of States and social media companies in providing an enabling environment for freedom of expression and access to information online. In the face of contemporary threats such as “fake news” and disinformation and online extremism, the Special Rapporteur urges States to reconsider speech-based restrictions and adopt smart regulation targeted at enabling the public to make choices about how and whether to engage in online fora. The Special Rapporteur also conducts an in-depth investigation of how Internet companies moderate content on major social media platforms, and argues that human rights law gives companies the tools to articulate their positions in ways that respect democratic norms and counter authoritarian demands. The report is the culmination of a year-long series of consultations, visits to major internet companies and a wide range of State and civil society input."

UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye. Report on the Regulation of User-Generated Online Content. A/HRC/38/35. 6 April 2018.

Author: UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (David Kaye)

“Threats to digital expression and Internet freedom are more pronounced than ever. Internet shutdowns have emerged as a popular means of information control. Government surveillance continues to intensify worldwide, jeopardizing the privacy and security of millions. Net neutrality – the long-held premise that all Internet data should be treated equally and without undue interference – has come under attack. In this increasingly hostile environment, what are the human rights responsibilities of the Information, Communications and Technology sector – particularly those actors that facilitate the provision of telecommunications and Internet access, and serve as gatekeepers of the digital infrastructure? To address this question, the Special Rapporteur first examines the role of States in undermining freedom of expression online, and what their obligation to protect this fundamental right entails. The Special Rapporteur subsequently evaluates the role of digital access providers – not just telecommunications companies and Internet service providers, which have become synonymous with digital access, but also non-consumer facing actors like network equipment vendors, content delivery networks, and Internet exchange points. Drawing on the United Nations Guiding Principles on Business and Human Rights and best practices in the field, the Special Rapporteur proposes concrete steps that digital access providers should take to safeguard the freedom of expression of Internet users worldwide.”

UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression. Report on the Role of Digital Access Providers. A/HRC/35/22. March 2017.

Author: UN Special Rapporteur David Kaye

The report (A/HRC/35/22) “addresses the roles played by private actors engaged in the provision of Internet and telecommunications access. [The Special Rapporteur] begins by examining State obligations to protect and promote freedom of expression online, then evaluates the digital access industry’s roles, to conclude with a set of principles that could guide the private sector’s steps to respect human rights.” 

UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye. Report on the Role of Digital Access Providers. A/HRC/35/22. 30 March 2017.

Author: The Future of Free Speech (Jacob Mchangama, Natalie Alkiviadou, and Raghav Mendiratta)

“For the first time in human history, ordinary people have been given the ability to publicly share and access information instantly and globally through social media, without the mediation of traditional gatekeepers such as newspaper editors or government censors. Yet, the growth of social media has made even democracies wary of the resulting impact on the global ecosystem of news, opinion, and information. Unmediated and instant access to the global digital sphere has gone hand in hand with the amplification and global dissemination of harms, including online extremism and disinformation. With the entry into force of the Network Enforcement Act (NetzDG) in 2017, Germany became the first country in the world to require online platforms with more than 2 million users in their country to remove “manifestly illegal” content within a time period of 24 hours. Since the adoption of the NetzDG, more than 20 States around the world – including France – have adopted similar laws imposing “intermediary liability” on social media platforms. While democracies impose intermediary liability to counter online harms, ‘outsourcing’ government mandated content regulation to private actors raises serious questions about the consequences on online freedom of expression. The objective of this report is a preliminary and indicative attempt to sketch the duration of national legal proceedings in hate speech cases in selected Council of Europe States. The length of domestic criminal proceedings is then compared with the timeframe within which some governments require platforms to decide and take down hate speech under laws such as the NetzDG. Due to the nature of the relevant data, the following comparison between national criminal proceedings and time limits under government mandated notice and take down regimes is merely indicative and preliminary. Nevertheless, it is hoped that it may contribute to answering the question of how to develop time limits that are consistent with a meaningful assessment of the free speech interests of users of large social media platforms. A question essential to the future of online free speech.”

The Future of Free Speech, Jacob Mchangama, Natalie Alkiviadou, and Raghav Mendiratta. “Rushing to Judgment: Are Short Mandatory Takedown Limits for Online Hate Speech Compatible with the Freedom of Expression?”. 2021. https://futurefreespeech.com/wp-content/uploads/2021/01/FFS_Rushing-to-….

Self-regulation and ‘hate speech’ on social media platforms

Author: ARTICLE 19

"In this brief, ARTICLE 19 seeks to contribute to discussions on greater regulation of social media platforms, including calls for such platforms to be considered publishers. We do so by exploring a possible model for the independent and effective self-regulation of social media platforms."

ARTICLE 19. Self-regulation and ‘hate speech’ on social media platforms. London: ARTICLE 19, 2018.

Author: ARTICLE 19

"Offering a concise overview of the current state of content moderation on the largest social media platforms and the impacts on freedom of expression, the practical handbook seeks to dissect the complex intersection of freedom of expression, content moderation, and the business models of these tech giants.

The handbook, produced by ARTICLE 19 under the UNESCO project Social Media 4 Peace funded by the European Union, includes numerous concrete examples and cases to illustrate the questions raised by different standards, practices and policies pertinent to content moderation. It builds upon ARTICLE 19’s policies and expertise in content moderation and platform regulation and reflects ARTICLE 19’s long-standing calls that measures responding to problematic content including ‘disinformation’ and ‘hate speech’ must always conform with international standards on freedom of expression and other human rights."

ARTICLE 19. “Social Media 4 Peace: Content Moderation and Freedom of Expression Handbook”. 2023. https://www.article19.org/wp-content/uploads/2023/08/SM4P-Content-moderation-handbook-9-Aug-final.pdf

Author: Masaar

Published by Masaar, a community of lawyers and technologists advancing digital rights in Egypt, the article explains “the Fediverse” as a challenge to the concentration of Internet power in the hands of a few tech companies. The Fediverse is built on two core ideas: decentralization and federation. The article dives into both and gives an overview of the Fediverse’s technological foundation, its philosophy, objectives, first application, and evolution. It also lists some of the networks currently running (Mastodon, PeerTube, Diaspora, and Pixelfed) and discusses the Fediverse’s future along with the challenges it faces, such as difficulty in attracting users, a lack of sustainability guarantees, and security threats. The article concludes on an optimistic note, encouraging Internet users to try a Fediverse application: “Building a free Internet is the only way for it to support its users’ rights and freedoms. Thus, tools like the Fediverse are very important for the future of the Internet and accordingly for the future of us all.”

Masaar. “Social Media Platforms In The Age Of The Fediverse.” March 6, 2024. https://masaar.net/en/social-media-platforms-in-the-age-of-the-fediverse/

Author: Carnegie Council for Ethics in International Affairs, David Kaye

“The Internet was designed to be a kind of free-speech paradise, but it has also been used to incite violence, spread lies, and promote hate. Over the years, three American behemoths – Facebook, YouTube, and Twitter – became the way many people around the world experience the Internet, and therefore act as the conveyors of some of its most disturbing material. Who should decide whether content should be removed from platforms, or which users should be kicked off? Should the giant social media platforms police the content themselves, as is the norm in the U.S., or should governments and international organizations regulate the Internet, as many are demanding in Europe? How do we keep from helping authoritarian regimes to censor all criticisms of themselves? David Kaye is the United Nations Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, the global body’s principal monitor for freedom of expression issues worldwide. He is also a clinical professor of law and the director of the International Justice Clinic at the University of California, Irvine.”

Carnegie Council for Ethics in International Affairs, David Kaye. “Speech Police: The Global Struggle to Govern the Internet”. June 2019. https://www.youtube.com/watch?v=W6PDZ-o5Khg.

Author: UN, OSCE, OAS and ACHPR Special Rapporteurs for Freedom of Expression

The Special Rapporteurs identify the ten key challenges to freedom of expression in the next decade: Mechanisms of Government Control over the Media, Criminal Defamation, Violence Against Journalists, Limits on the Right to Information, Discrimination in the Enjoyment of the Right to Freedom of Expression, Commercial Pressures, Support for Public Service and Community Broadcasters, Security and Freedom of Expression, Freedom of Expression on the Internet, Access to Information and Communications Technologies.

UN, OSCE, OAS and ACHPR Special Rapporteurs for Freedom of Expression. Tenth Anniversary Joint Declaration: Ten Key Challenges to Freedom of Expression in the Next Decade. 2 February 2010.