Indigenous Data Sovereignty and Policy


Book edited by Maggie Walter, Tahu Kukutai, Stephanie Russo Carroll and Desi Rodriguez-Lonebear: “This book examines how Indigenous Peoples around the world are demanding greater data sovereignty, and challenging the ways in which governments have historically used Indigenous data to develop policies and programs.

In the digital age, governments are increasingly dependent on data and data analytics to inform their policies and decision-making. However, Indigenous Peoples have often been the unwilling targets of policy interventions and have had little say over the collection, use and application of data about them, their lands and cultures. At the heart of Indigenous Peoples’ demands for change are the enduring aspirations of self-determination over their institutions, resources, knowledge and information systems.

With contributors from Australia, Aotearoa New Zealand, North and South America and Europe, this book offers a rich account of the potential for Indigenous data sovereignty to support human flourishing and to protect against the ever-growing threats of data-related risks and harms….(More)”.

AI technologies — like police facial recognition — discriminate against people of colour


Jane Bailey et al at The Conversation: “…In his game-changing 1993 book, The Panoptic Sort, scholar Oscar Gandy warned that “complex technology [that] involves the collection, processing and sharing of information about individuals and groups that is generated through their daily lives … is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy.” Law enforcement uses it to pluck suspects from the general public, and private organizations use it to determine whether we have access to things like banking and employment.

Gandy prophetically warned that, if left unchecked, this form of “cybernetic triage” would exponentially disadvantage members of equality-seeking communities — for example, groups that are racialized or socio-economically disadvantaged — both in terms of what would be allocated to them and how they might come to understand themselves.

Some 25 years later, we’re now living with the panoptic sort on steroids. And examples of its negative effects on equality-seeking communities abound, such as the false identification of Williams.

Pre-existing bias

This algorithmic sorting infiltrates the most fundamental aspects of everyday life, occasioning both direct and structural violence in its wake.

The direct violence experienced by Williams is immediately evident in the events surrounding his arrest and detention, and the individual harms he experienced are obvious and can be traced to the actions of police who chose to rely on the technology’s “match” to make an arrest. More insidious is the structural violence perpetrated through facial recognition technology and other digital technologies that rate, match, categorize and sort individuals in ways that magnify pre-existing discriminatory patterns.

Structural violence harms are less obvious and less direct, and cause injury to equality-seeking groups through the systematic denial of power, resources and opportunity. Simultaneously, structural violence increases direct risk and harm to individual members of those groups.

Predictive policing uses algorithmic processing of historical data to predict when and where new crimes are likely to occur, assigns police resources accordingly and embeds enhanced police surveillance into communities, usually in lower-income and racialized neighbourhoods. This increases the chances that any criminal activity — including less serious criminal activity that might otherwise prompt no police response — will be detected and punished, ultimately limiting the life chances of the people who live within that environment….(More)”.
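To make the feedback loop described in this excerpt concrete, here is a minimal Python sketch (not from the original article) of how allocation driven by past recorded crime can entrench an initial skew. The neighbourhood labels, offence rate and allocation rule are illustrative assumptions only, not drawn from any real policing system or dataset.

```python
# Hypothetical sketch of the predictive-policing feedback loop: patrols follow past
# *recorded* crime, and detection requires presence, so an historically over-policed
# area keeps generating more records even when true offence rates are identical.
import random

random.seed(0)

TRUE_OFFENCE_RATE = 0.05                        # identical underlying rate everywhere
recorded = {"A (over-policed)": 60, "B": 40}    # historical records already skewed
PATROLS_PER_ROUND = 100

for round_no in range(10):
    total = sum(recorded.values())
    for hood, past in list(recorded.items()):
        # "Predictive" allocation: patrols are proportional to past recorded crime.
        patrols = round(PATROLS_PER_ROUND * past / total)
        # More patrols -> more incidents detected and recorded, despite equal true rates.
        detected = sum(random.random() < TRUE_OFFENCE_RATE for _ in range(patrols))
        recorded[hood] += detected

print(recorded)  # A typically ends with an even larger recorded gap over B,
                 # so under this rule it would keep receiving more patrols.
```

Under these assumptions the absolute gap in recorded crime between the two neighbourhoods tends to widen over time, which is one way the "magnification of pre-existing discriminatory patterns" described above can arise.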

An Introduction to Ethics in Robotics and AI


Open access book by Christoph Bartneck, Christoph Lütge, Alan Wagner and Sean Welsh: “This book provides an introduction into the ethics of robots and artificial intelligence. The book was written with university students, policy makers, and professionals in mind but should be accessible for most adults. The book is meant to provide balanced and, at times, conflicting viewpoints as to the benefits and deficits of AI through the lens of ethics. As discussed in the chapters that follow, ethical questions are often not cut and dry. Nations, communities, and individuals may have unique and important perspectives on these topics that should be heard and considered. While the voices that compose this book are our own, we have attempted to represent the views of the broader AI, robotics, and ethics communities….(More)”.

The Rise of the Data Poor: The COVID-19 Pandemic Seen From the Margins


Essay by Stefania Milan and Emiliano Treré: “Quantification is central to the narration of the COVID-19 pandemic. Numbers determine the existence of the problem and affect our ability to care and contribute to relief efforts. Yet many communities at the margins, including many areas of the Global South, are virtually absent from this number-based narration of the pandemic. This essay builds on critical data studies to warn against the universalization of problems, narratives, and responses to the virus. To this end, it explores two types of data gaps and the corresponding “data poor.” The first gap concerns the data poverty perduring in low-income countries and jeopardizing their ability to adequately respond to the pandemic. The second affects vulnerable populations within a variety of geopolitical and socio-political contexts, whereby data poverty constitutes a dangerous form of invisibility which perpetuates various forms of inequality. But, even during the pandemic, the disempowered manage to create innovative forms of solidarity from below that partially mitigate the negative effects of their invisibility….(More)”.

Data Justice and COVID-19: Global Perspectives


Book edited by Linnet Taylor, Aaron Martin, Gargi Sharma and Shazade Jameson: “In early 2020, as the COVID-19 pandemic swept the world and states of emergency were declared by one country after another, the global technology sector—already equipped with unprecedented wealth, power, and influence—mobilised to seize the opportunity. This collection is an account of what happened next and captures the emergent conflicts and responses around the world. The essays provide a global perspective on the implications of these developments for justice: they make it possible to compare how the intersection of state and corporate power—and the way that power is targeted and exercised—confronts, and invites resistance from, civil society in countries worldwide.

This edited volume captures the technological response to the pandemic in 33 countries, accompanied by nine thematic reflections, and traces the unfolding of the first wave of the pandemic.

This book can be read as a guide to the landscape of technologies deployed during the pandemic and also be used to compare individual country strategies. It will prove useful as a tool for teaching and learning in various disciplines and as a reference point for activists and analysts interested in issues of data justice.

The essays interrogate these technologies and the political, legal, and regulatory structures that determine how they are applied. In doing so, the book exposes the workings of state technological power to critical assessment and contestation….(More)”

When Mini-Publics and Maxi-Publics Coincide: Ireland’s National Debate on Abortion


Paper by David Farrell et al: “Ireland’s Citizens’ Assembly (CA) of 2016–18 was tasked with making recommendations on abortion. This paper shows that from the outset its members were in large part in favour of the liberalisation of abortion (though a fair proportion were undecided), that over the course of its deliberations the CA as a whole moved in a more liberal direction on the issue, but that its position was largely reflected in the subsequent referendum vote by the population as a whole….(More)”

Can You Fight Systemic Racism With Design?


Reboot’s “Design With” podcast with Antionette Carroll: “What began as a 24-hour design challenge addressing racial inequality in Ferguson, MO has since grown into a powerful organization fighting inequity with its own brand of collaborative design. Antionette Carroll, founder of Creative Reaction Lab, speaks about Equity-Centered Community Design—and how Black and Latinx youth are using design as their tool of choice to dismantle the very systems designed to exclude them….(More)”.

Interventions to mitigate the racially discriminatory impacts of emerging tech including AI


Joint Civil Society Statement: “As widespread recent protests have highlighted, racial inequality remains an urgent and devastating issue around the world, and this is as true in the context of technology as it is everywhere else. In fact, it may be more so, as algorithmic technologies based on big data are deployed at previously unimaginable scale, reproducing the discriminatory systems that build and govern them.

The undersigned organizations welcome the publication of the report “Racial discrimination and emerging digital technologies: a human rights analysis,” by Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, E. Tendayi Achiume, and wish to underscore the importance and timeliness of a number of the recommendations made therein:

  1. Technologies that have had or will have significant racially discriminatory impacts should be banned outright.
    While incremental regulatory approaches may be appropriate in some contexts, where a technology is demonstrably likely to cause racially discriminatory harm, it should not be deployed until that harm can be prevented. Moreover, certain technologies may always have disparate racial impacts, no matter how much their accuracy can be improved. In the present moment, racially discriminatory technologies include facial and affect recognition technology and so-called predictive analytics. We support Special Rapporteur Achiume’s call for mandatory human rights impact assessments as a prerequisite for the adoption of new technologies. We also believe that where such assessments reveal that a technology has a high likelihood of deleterious racially disparate impacts, states should prevent its use through a ban or moratorium. We join the Special Rapporteur in welcoming recent municipal bans, for example, on the use of facial recognition technology, and encourage national governments to adopt similar policies.  Correspondingly, we reiterate our support for states’ imposition of an immediate moratorium on the trade and use of privately developed surveillance tools until such time as states enact appropriate safeguards, and congratulate Special Rapporteur Achiume on joining that call.
  2. Gender mainstreaming and representation along racial, national and other intersecting identities requires radical improvement at all levels of the tech sector.
  3. Technologists cannot solve political, social, and economic problems without the input of domain experts and those personally impacted.
  4. Access to technology is as urgent an issue of racial discrimination as inequity in the design of technologies themselves.
  5. Representative and disaggregated data is a necessary, if not sufficient, condition for racial equity in emerging digital technologies, but it must be collected and managed equitably as well.
  6. States as well as corporations must provide remedies for racial discrimination, including reparations.… (More)”.

Race and America: why data matters


Federica Cocco and Alan Smith at the Financial Times: “… To understand the historical roots of black data activism, we have to return to October 1899. Back then, Thomas Calloway, a clerk in the War Department, wrote to the educator Booker T Washington about his pitch for an “American Negro Exhibit” at the 1900 Exposition Universelle in Paris. It was right in the middle of the scramble for Africa and Europeans had developed a morbid fascination with the people they were trying to subjugate.

To Calloway, the Paris exhibition offered a unique venue to sway the global elite to acknowledge “the possibilities of the Negro” and to influence cultural change in the US from an international platform.

It is hard to overstate the importance of international fairs at the time. They were a platform to bolster the prestige of nations. In Delivering Views: Distant Cultures in Early Postcards, Robert Rydell writes that fairs had become “a vehicle that, perhaps next to the church, had the greatest capacity to influence a mass audience”….

For the Paris World Fair, Du Bois and a team of Atlanta University students and alumni designed and drew by hand more than 60 bold data portraits. A first set used Georgia as a case study to illustrate the progress made by African Americans since the Civil War.

A second set showed how “the descendants of former African slaves now in residence in the United States of America” had become lawyers, doctors, inventors and musicians. For the first time, the growth of literacy and employment rates, the value of assets and land owned by African Americans and their growing consumer power were there for everyone to see. At the 1900 World Fair, the “Exhibit of American Negroes” took up a prominent spot in the Palace of Social Economy. “As soon as they entered the building, visitors were inundated by examples of black excellence,” says Whitney Battle-Baptiste, director of the WEB Du Bois Center at the University of Massachusetts Amherst and co-author of WEB Du Bois’s Data Portraits: Visualizing Black America….(More)”

Working with students and alumni from Atlanta University, Du Bois created 60 bold data portraits for the ‘Exhibit of American Negroes’ © Library of Congress, Prints & Photographs Division

#HumanRights: The Technologies and Politics of Justice Claims in Practice


Book by Ronald Niezen: “Social justice and human rights movements are entering a new phase. Social media, artificial intelligence, and digital forensics are reshaping advocacy and compliance. Technicians, lawmakers, and advocates, sometimes in collaboration with the private sector, have increasingly gravitated toward the possibilities and dangers inherent in the nonhuman. #HumanRights examines how new technologies interact with older models of rights claiming and communication, influencing and reshaping the modern-day pursuit of justice.

Ronald Niezen argues that the impacts of information technologies on human rights are not found through an exclusive focus on sophisticated, expert-driven forms of data management but in considering how these technologies are interacting with other, “traditional” forms of media to produce new avenues of expression, public sympathy, redress of grievances, and sources of the self. Niezen considers various ways that the pursuit of justice is happening via new technologies, including crowdsourcing, social media–facilitated mobilizations (and enclosures), WhatsApp activist networks, and the selective attention of Google’s search engine algorithm. He uncovers how emerging technologies of data management and social media influence the ways that human rights claimants and their allies pursue justice, and the “new victimology” that prioritizes and represents strategic lives and types of violence over others. #HumanRights paints a striking and important panoramic picture of the contest between authoritarianism and the new tools by which people attempt to leverage human rights and bring the powerful to account….(More)”.