Freedom of Expression Online

The resources in this Module focus on some of the complex issues raised by the digital exercise of freedom of expression. The Internet, social media, and search engines have profoundly transformed how people express themselves, access information, and communicate. The selected readings highlight the mismatch between these practices and a body of law still trying to catch up with technological change, while seeking to make sense of the resulting normative cacophony.

Content Regulation and Censorship

Author: Kate Jones

“There is a widespread desire to tackle online interference with elections and political discourse. To date, much of the debate has focused on what processes should be established without adequate consideration of what norms should underpin those processes. Human rights law should be at the heart of any discussion of regulation, guidance, corporate or societal responses. The UN Secretary-General’s High-level Panel on Digital Cooperation has recently reached a similar conclusion, stating ‘there is an urgent need to examine how time-honoured human rights frameworks and conventions should guide digital cooperation and digital technology’. This paper attempts to contribute to this examination. Chapter 2 of this paper clarifies terms and concepts discussed. Chapter 3 provides an overview of cyber activities that may influence voters. Chapter 4 summarizes a range of responses by states, the EU and digital platforms themselves. Chapter 5 discusses relevant human rights law, with specific reference to: the right to freedom of thought, and the right to hold opinions without interference; the right to privacy; the right to freedom of expression; and the right to participate in public affairs and vote. Chapter 6 offers some conclusions, and sets out recommendations on how human rights ought to guide state and corporate responses.”

Kate Jones. “Online Disinformation and Political Discourse: Applying a Human Rights Framework”. 2019. https://www.chathamhouse.org/sites/default/files/2019-11-05-Online-Disinformation-Human-Rights.pdf

Author: Nani Jansen Reventlow, Jonathon Penney, Amy Johnson, Rey Junco, Casey Tilton, Kate Coyer, Nighat Dad, Adnan Chaudhri, Grace Mutung’u, Susan Benesch, Andres Lombana-Bermudez, Helmi Noman, Kendra Albert, Anke Sterzing, Felix Oberholzer-Gee, Holger Melas, Lumi Zuleta, Simin Kargar, J. Nathan Matias, Nikki Bourassa, Urs Gasser

"This collection of essays includes perspectives on and approaches to harmful speech online from a wide range of voices within the Berkman Klein Center community. Recognizing that harmful speech online is an increasingly prevalent issue within society, we intend for the collection to highlight diverse views and strands of thought and to make them available to a wide range of audiences."

Nani Jansen Reventlow, et al., Perspectives on Harmful Speech Online. Berkman Klein Center for Internet & Society Research Publication, 2016.

Author: Luca Belli and Nicolo Zingales (eds)

“This book is the Official 2017 Outcome of the UN IGF Dynamic Coalition on Platform Responsibility (DCPR), which is a multistakeholder group fostering a cooperative analysis of online platforms’ responsibility to respect human rights, while putting forward solutions to protect platform-users’ rights. This book offers responses to the DCPR’s call for multistakeholder dialogue, made ever more pressing by the diverse and rising challenges generated by the platformisation of our economy and, more generally, our society. The analyses featured in this book critically explore the human rights dimension of the digital platform debate, subsequently focusing on the governance of personal data and, lastly, suggesting new solutions for the new roles played by online platforms. This volume includes the Recommendations on Terms of Service and Human Rights, which were elaborated through a multistakeholder participatory process, facilitated by the DCPR. In accordance with the UN Guiding Principles on Business and Human Rights, the Recommendations provide guidance for terms of service that may be deemed “responsible” due to their respect of internationally agreed human rights standards.”

Luca Belli and Nicolo Zingales (eds). “Platform Regulations: How Platforms are Regulated and How they Regulate Us”. 2017. http://bibliotecadigital.fgv.br/dspace/handle/10438/19402.

Author: UN Special Rapporteur David Kaye

In the report (A/73/348), the Special Rapporteur "explores the implications of artificial intelligence technologies for human rights in the information environment, focusing in particular on rights to freedom of opinion and expression, privacy and non-discrimination."

UN, Human Rights Council, Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye. Report on AI’s Impact on Freedom of Expression. A/73/348. 29 August 2018.

Author: UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (David Kaye)

“In the first-ever UN report that examines the regulation of user-generated online content, the Special Rapporteur examines the role of States and social media companies in providing an enabling environment for freedom of expression and access to information online. In the face of contemporary threats such as “fake news” and disinformation and online extremism, the Special Rapporteur urges States to reconsider speech-based restrictions and adopt smart regulation targeted at enabling the public to make choices about how and whether to engage in online fora. The Special Rapporteur also conducts an in-depth investigation of how Internet Companies moderate content on major social media platforms, and argues that human rights law gives companies the tools to articulate their positions in ways that respect democratic norms and counter authoritarian demands.”

UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, David Kaye. Report on Content Regulation. A/HRC/38/35. April 2018.

Author: UN Special Rapporteur David Kaye

In the report (A/HRC/38/35), which examines the regulation of user-generated online content, "the Special Rapporteur examines the role of States and social media companies in providing an enabling environment for freedom of expression and access to information online. In the face of contemporary threats such as “fake news” and disinformation and online extremism, the Special Rapporteur urges States to reconsider speech-based restrictions and adopt smart regulation targeted at enabling the public to make choices about how and whether to engage in online fora. The Special Rapporteur also conducts an in-depth investigation of how Internet companies moderate content on major social media platforms, and argues that human rights law gives companies the tools to articulate their positions in ways that respect democratic norms and counter authoritarian demands. The report is the culmination of a year-long series of consultations, visits to major internet companies and a wide range of State and civil society input."

UN, Human Rights Council, Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye. Report on the regulation of user-generated online content. A/HRC/38/35. 6 April 2018.

Author: UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (David Kaye)

“Threats to digital expression and Internet freedom are more pronounced than ever. Internet shutdowns have emerged as a popular means of information control. Government surveillance continues to intensify worldwide, jeopardizing the privacy and security of millions. Net neutrality – the long-held premise that all Internet data should be treated equally and without undue interference – has come under attack. In this increasingly hostile environment, what are the human rights responsibilities of the Information, Communications and Technology sector – particularly those actors that facilitate the provision of telecommunications and Internet access, and serve as gatekeepers of the digital infrastructure? To address this question, the Special Rapporteur first examines the role of States in undermining freedom of expression online, and what their obligation to protect this fundamental right entails. The Special Rapporteur subsequently evaluates the role of digital access providers – not just telecommunications companies and Internet service providers, which have become synonymous with digital access, but also non-consumer facing actors like network equipment vendors, content delivery networks, and Internet exchange points. Drawing on the United Nations Guiding Principles on Business and Human Rights and best practices in the field, the Special Rapporteur proposes concrete steps that digital access providers should take to safeguard the freedom of expression of Internet users worldwide.”

UN Human Rights Council, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression. Report on the Role of Digital Access Providers. A/HRC/35/22. March 2017.

Author: UN Special Rapporteur David Kaye

The report (A/HRC/35/22) “addresses the roles played by private actors engaged in the provision of Internet and telecommunications access. [The Special Rapporteur] begins by examining State obligations to protect and promote freedom of expression online, then evaluates the digital access industry’s roles, to conclude with a set of principles that could guide the private sector’s steps to respect human rights.” 

UN, Human Rights Council, Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye. Report on the role of digital access providers. A/HRC/35/22. 30 March 2017.

Author: Viasna

This report by the Human Rights Center “Viasna,” published in 2024, provides an updated review of the crackdown on rights and freedoms in Belarus, covering the period from March 2023 to March 2024. The report shows that, under the pretext of tackling extremism and terrorism, the Belarusian authorities have been amending legislation and using it to ramp up repression. The report outlines applicable international standards, surveys national legislation, explains the practice of designating individuals and legal entities as “extremist” and “terrorist,” and unpacks the criminal prosecution practices employed to restrict free speech – on charges ranging from the dissemination of fakes to “insulting government officials” to hooliganism, among many others.

Viasna. Restrictions on Freedom of Expression under the Pretext of Fighting Extremism and Terrorism. Human Rights Center “Viasna,” 2024. https://spring96.org/files/book/en/restrictions_freedom_expression_2024.pdf

Author: The Future of Free Speech (Jacob Mchangama, Natalie Alkiviadou, and Raghav Mendiratta)

“For the first time in human history, ordinary people have been given the ability to publicly share and access information instantly and globally through social media, without the mediation of traditional gatekeepers such as newspaper editors or government censors. Yet, the growth of social media has made even democracies wary of the resulting impact on the global ecosystem of news, opinion, and information. Unmediated and instant access to the global digital sphere has gone hand in hand with the amplification and global dissemination of harms, including online extremism and disinformation. With the entry into force of the Network Enforcement Act (NetzDG) in 2017, Germany became the first country in the world to require online platforms with more than 2 million users in their country to remove “manifestly illegal” content within a time period of 24 hours. Since the adoption of the NetzDG, more than 20 States around the world – including France – have adopted similar laws imposing “intermediary liability” on social media platforms. While democracies impose intermediary liability to counter online harms, ‘outsourcing’ government mandated content regulation to private actors raises serious questions about the consequences on online freedom of expression. The objective of this report is a preliminary and indicative attempt to sketch the duration of national legal proceedings in hate speech cases in selected Council of Europe States. The length of domestic criminal proceedings is then compared with the timeframe within which some governments require platforms to decide and take down hate speech under laws such as the NetzDG. Due to the nature of the relevant data, the following comparison between national criminal proceedings and time limits under government mandated notice and take down regimes is merely indicative and preliminary. Nevertheless, it is hoped that it may contribute to answering the question of how to develop time limits that are consistent with a meaningful assessment of the free speech interests of users of large social media platforms. A question essential to the future of online free speech.”

The Future of Free Speech, Jacob Mchangama, Natalie Alkiviadou, and Raghav Mendiratta. “Rushing to Judgment: Are Short Mandatory Takedown Limits for Online Hate Speech Compatible with the Freedom of Expression?”. 2021. https://futurefreespeech.com/wp-content/uploads/2021/01/FFS_Rushing-to-….