Free Speech: A History from Socrates to Social Media


Book by Jacob Mchangama: A global history of free speech, from the ancient world to today. Hailed as the “first freedom,” free speech is the bedrock of democracy. But it is a challenging principle, subject to erosion in times of upheaval. Today, in democracies and authoritarian states around the world, it is on the retreat.

In Free Speech, Jacob Mchangama traces the riveting legal, political, and cultural history of this idea. Through captivating stories of free speech’s many defenders—from the ancient Athenian orator Demosthenes and the ninth-century freethinker al-Rāzī, to the anti-lynching crusader Ida B. Wells and modern-day digital activists—Mchangama reveals how the free exchange of ideas underlies all intellectual achievement and has enabled the advancement of both freedom and equality worldwide. Yet the desire to restrict speech, too, is a constant, and he explores how even its champions can be led down this path when the rise of new and contrarian voices challenges power and privilege of all stripes.

Meticulously researched and deeply humane, Free Speech demonstrates how much we have gained from this principle—and how much we stand to lose without it…(More)”.

Oversight Board publishes policy advisory opinion on the sharing of private residential information


Press Release by Oversight Board: “Last year, Meta requested a policy advisory opinion from the Board on the sharing of private residential addresses and images, and the contexts in which this information may be published on Facebook and Instagram. Meta considers this to be a difficult question as, while access to such information can be relevant to journalism and civic activism, “exposing this information without consent can create a risk to residents’ safety and infringe on an individual’s privacy.”

Meta’s request noted several potential harms linked to releasing personal information, including residential addresses and images. These include “doxing” (from “dox,” an abbreviation of “documents”), in which information that can identify someone is revealed online. Meta noted that doxing can have negative real-world consequences, such as harassment or stalking…

The Board understands that the sharing of private residential addresses and images represents a potentially serious violation of the right to privacy both for people who use Facebook and Instagram, and those who do not.

Once this information is shared, the harms that can result, such as doxing, are difficult to remedy. Harms resulting from doxing disproportionately affect groups such as women, children and LGBTQIA+ people, and can include emotional distress, loss of employment and even physical harm or death.

As the potential for harm is particularly context specific, it is challenging to develop objective and universal indicators that would allow content reviewers to distinguish the sharing of content that would be harmful from shares that would not be. That is why the Board believes that the Privacy Violations policy should be more protective of privacy.

International human rights standards permit necessary and proportionate restrictions on expression to protect people’s right to privacy. As such, the Board favors narrowing the exceptions to the Privacy Violations policy to help Meta better protect the private residential information of people both on and off its platforms.

In exchanges with the Board, Meta stressed that “ensuring that the ‘publicly available’ definition does not exempt content from removal that poses a risk of offline harm” is a “persistent concern.” Public records and other sources of what could be considered “publicly available” information still require resources and effort to be accessed by the general public. On social media, however, such information may be shared and accessed more quickly, and on a much bigger scale, which significantly increases the risk of harm. As such, the Board proposes removing the “publicly available” exception for the sharing of both private residential addresses and images that meet certain criteria…(More)”.

Artificial Intelligence Bias and Discrimination: Will We Pull the Arc of the Moral Universe Towards Justice?


Paper by Emile Loza de Siles: “In 1968, the Reverend Martin Luther King Jr. foresaw the inevitability of society’s eventual triumph over the deep racism of his time and the stain that continues to cast its destructive, oppressive pall today. From the pulpit of the nation’s church, Dr. King said, “We shall overcome because the arc of the moral universe is long, but it bends toward justice”. More than 40 years later, Eric Holder, the first African American United States Attorney General, agreed, but only if people acting with conviction exert themselves to pull that arc towards justice.

With artificial intelligence (AI) bias and discrimination rampant, the need to pull the moral arc towards algorithmic justice is urgent. This article offers empowering clarity by conceptually bifurcating AI bias problems into AI bias engineering and organisational AI governance problems, revealing proven legal development pathways to protect against the corrosive harms of AI bias and discrimination…(More)”.

We Still Can’t See American Slavery for What It Was


Jamelle Bouie at the New York Times: “…It is thanks to decades of painstaking, difficult work that we know a great deal about the scale of human trafficking across the Atlantic Ocean and about the people aboard each ship. Much of that research is available to the public in the form of the SlaveVoyages database. A detailed repository of information on individual ships, individual voyages and even individual people, it is a groundbreaking tool for scholars of slavery, the slave trade and the Atlantic world. And it continues to grow. Last year, the team behind SlaveVoyages introduced a new data set with information on the domestic slave trade within the United States, titled “Oceans of Kinfolk.”

The systematic effort to quantify the slave trade goes back at least as far as the 19th century…

Because of its specificity with regard to individual enslaved people, this new information is as pathbreaking for lay researchers and genealogists as it is for scholars and historians. It is also, for me, an opportunity to think about the difficult ethical questions that surround this work: How exactly do we relate to data that allows someone — anyone — to identify a specific enslaved person? How do we wield these powerful tools for quantitative analysis without abstracting the human reality away from the story? And what does it mean to study something as wicked and monstrous as the slave trade using some of the tools of the trade itself?…

“The data that we have about those ships is also kind of caught in a stranglehold of ship captains who care about some things and don’t care about others,” Jennifer Morgan said. We know what was important to them. It is the task of the historian to bring other resources to bear on this knowledge, to shed light on what the documents, and the data, might obscure.

“By merely reproducing the metrics of slave traders,” Fuentes said, “you’re not actually providing us with information about the people, the humans, who actually bore the brunt of this violence. And that’s important. It is important to humanize this history, to understand that this happened to African human beings.”

It’s here that we must engage with the question of the public. Work like the SlaveVoyages database exists in the “digital humanities,” a frequently public-facing realm of scholarship and inquiry. And within that context, an important part of respecting the humanity of the enslaved is thinking about their descendants.

“If you’re doing a digital humanities project, it exists in the world,” said Jessica Marie Johnson, an assistant professor of history at Johns Hopkins and the author of “Wicked Flesh: Black Women, Intimacy, and Freedom in the Atlantic World.” “It exists among a public that is beyond the academy and beyond Silicon Valley. And that means that there should be certain other questions that we ask, a different kind of ethics of care and a different morality that we bring to things.”…(More)”.

Commission puts forward declaration on digital rights and principles for everyone in the EU


Press Release: “Today, the Commission is proposing to the European Parliament and Council to sign up to a declaration of rights and principles that will guide the digital transformation in the EU.

[Graphic (European Commission): a pyramid with cutting-edge technology at its base, rules in the middle, and democracy at the top.]

The draft declaration on digital rights and principles aims to give everyone a clear reference point about the kind of digital transformation Europe promotes and defends. It will also provide a guide for policy makers and companies when dealing with new technologies. The rights and freedoms enshrined in the EU’s legal framework, and the European values expressed by the principles, should be respected online as they are offline. Once jointly endorsed, the Declaration will also define the approach to the digital transformation which the EU will promote throughout the world…(More)”.

Building machines that work for everyone – how diversity of test subjects is a technology blind spot, and what to do about it


Article by Tahira Reid and James Gibert: “People interact with machines in countless ways every day. In some cases, they actively control a device, like driving a car or using an app on a smartphone. Sometimes people passively interact with a device, like being imaged by an MRI machine. And sometimes they interact with machines without consent or even knowing about the interaction, like being scanned by a law enforcement facial recognition system.

Human-Machine Interaction (HMI) is an umbrella term that describes the ways people interact with machines. HMI is a key aspect of researching, designing and building new technologies, and also studying how people use and are affected by technologies.

Researchers, especially those traditionally trained in engineering, are increasingly taking a human-centered approach when developing systems and devices. This means striving to make technology that works as expected for the people who will use it by taking into account what’s known about the people and by testing the technology with them. But even as engineering researchers increasingly prioritize these considerations, some in the field have a blind spot: diversity.

As an interdisciplinary researcher who thinks holistically about engineering and design and an expert in dynamics and smart materials with interests in policy, we have examined the lack of inclusion in technology design, the negative consequences and possible solutions….

It is possible to use a homogenous sample of people in publishing a research paper that adds to a field’s body of knowledge. And some researchers who conduct studies this way acknowledge the limitations of homogenous study populations. However, when it comes to developing systems that rely on algorithms, such oversights can cause real-world problems. Algorithms are only as good as the data used to build them.

Algorithms are often based on mathematical models that capture patterns and then inform a computer about those patterns to perform a given task. Imagine an algorithm designed to detect when colors appear on a clear surface. If the set of images used to train that algorithm consists of mostly shades of red, the algorithm might not detect when a shade of blue or yellow is present…(More)”.
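
The color-detection analogy above can be made concrete with a small sketch. The following toy Python example is our illustration rather than anything from the article: it builds a deliberately simple “is a color present?” detector whose notion of color comes entirely from its training examples, almost all of which are red. The synthetic RGB values, the per-channel tolerance, and the detector design are all illustrative assumptions.

```python
# Toy illustration: a detector that learns "color present" only from the
# colors it was trained on (here, almost exclusively red patches).
import numpy as np

rng = np.random.default_rng(0)

def patches(base_rgb, n, noise=10.0):
    """Generate n noisy RGB readings around a base color (a stand-in for training images)."""
    return np.clip(rng.normal(base_rgb, noise, size=(n, 3)), 0, 255)

# Skewed training set: the "color present" examples are nearly all red.
train_colored = patches([200, 40, 40], n=500)

# The detector's model of "color" is simply the training distribution.
mean = train_colored.mean(axis=0)
std = train_colored.std(axis=0)

def detects_color(rgb, k=4.0):
    """Flag a patch as 'color present' if every channel lies within k standard
    deviations of what the training data looked like."""
    return bool(np.all(np.abs(np.asarray(rgb, dtype=float) - mean) <= k * std))

for name, rgb in [("red", [205, 45, 35]),
                  ("blue", [40, 40, 200]),
                  ("yellow", [220, 220, 40])]:
    print(f"{name:6s} detected: {detects_color(rgb)}")

# Typical output: red is detected, while blue and yellow are not, because the
# detector's idea of "color" was learned from a red-only training set.
```

Nothing in this sketch is specific to color: the same failure appears whenever a model’s training data leaves out the people, or inputs, it will later encounter.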

Why Privacy Matters


Book by Neil Richards: “Many people tell us that privacy is dead, or that it is dying, but such talk is a dangerous fallacy. This book explains what privacy is, what privacy isn’t, and why privacy matters. Privacy is the extent to which human information is known or used, and it is fundamentally about the social power that human information provides over other people. The best way to ensure that power is checked and channeled in ways that benefit humans and their society is through rules—rules about human information. And because human information rules of some sort are inevitable, we should craft our privacy rules to promote human values. The book suggests three such values that our human information rules should promote: identity, freedom, and protection. Identity allows us to be thinking, self-defining humans; freedom lets us be citizens; while protection safeguards our roles as situated consumers and workers, allowing us, as members of society, to trust and rely on other people so that we can live our lives and hopefully build a better future together…(More)”.

‘Sharing Is Caring’: Creative Commons, Transformative Culture, and Moral Rights Protection


Paper by Alexandra Giannopoulou: “The practice of sharing works free from traditional legal reservations aims to mark both ideological and systemic distance from the exclusive proprietary regime of copyright. The positive involvement of the public in acts of creativity is a defining feature of transformative culture in the digital sphere, which encourages creative collaborations between several people, without any limitation in space or time. Moral rights regimes are antithetical to these practices. This chapter will explore the moral rights challenges emerging from transformative culture. We will take the example of Creative Commons licenses and their interaction with internationally recognized moral rights. We conclude that the chilling effects of this legal uncertainty linked to moral rights enforcement could hurt copyright as a whole, but that moral rights can still constitute a strong defence mechanism against modern risks related to digital transformative creativity…(More)”.

Legal study on Government access to data in third countries


Report commissioned by the European Data Protection Board: “The present report is part of a study analysing the implications for the work of the European Union (EU)/ European Economic Area (EEA) data protection supervisory authorities (SAs) in relation to transfers of personal data to third countries after the Court of Justice of the European Union (CJEU) judgment C-311/18 on Data Protection Commissioner v. Facebook Ireland Ltd, Maximilian Schrems (Schrems II). Data controllers and processors may transfer personal data to third countries or international organisations only if the controller or processor has provided appropriate safeguards, and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available.

Whereas it is the primary responsibility of data exporters and data importers to assess that the legislation of the country of destination enables the data importer to comply with any of the appropriate safeguards, SAs will play a key role when issuing further decisions on transfers to third countries. Hence, this report provides the European Data Protection Board (EDPB) and the SAs in the EEA/EU with information on the legislation and practice in China, India and Russia on their governments’ access to personal data processed by economic operators. The report contains an overview of the relevant information in order for the SAs to assess whether and to what extent legislation and practices in the abovementioned countries imply massive and/or indiscriminate access to personal data processed by economic operators…(More)”.

The Tech That Comes Next


Book by Amy Sample Ward and Afua Bruce: “Who is part of technology development, who funds that development, and how we put technology to use all influence the outcomes that are possible. To change those outcomes, we must – all of us – shift our relationship to technology, how we use it, build it, fund it, and more. In The Tech That Comes Next, Amy Sample Ward and Afua Bruce – two leaders in equitable design and use of new technologies – invite you to join them in asking big questions and making change from wherever you are today.

This book connects ideas and conversations across sectors from artificial intelligence to data collection, community-centered design to collaborative funding, and social media to digital divides. Technology and equity are inextricably connected, and The Tech That Comes Next helps you accelerate change for the better…(More)”.