Issue Brief by Andrew Young: “Understanding how and why group data is collected and what can be done to protect children’s rights…While the data protection field largely focuses on individual data harms, that focus obscures and exacerbates the risks data poses to groups of people, such as the residents of a particular village, rather than to individuals.
Though not well-represented in the current responsible data literature and policy domains writ large, the challenges group data poses are immense. Moreover, the unique and amplified group data risks facing children are even less scrutinized and understood.
To achieve Responsible Data for Children (RD4C) and ensure effective and legitimate governance of children’s data, government policymakers, data practitioners, and institutional decision makers need to ensure children’s group data are a core consideration in all relevant policies, procedures, and practices….(More)”. (See also Responsible Data for Children).
Maryam Ahmed at BBC News: “Women with darker skin are more than twice as likely as lighter-skinned men to be told their photos fail UK passport rules when they submit them online, according to a BBC investigation.
One black student said she was wrongly told her mouth looked open each time she uploaded five different photos to the government website.
This shows how “systemic racism” can spread, Elaine Owusu said.
The Home Office said the tool helped users get their passports more quickly.
“The indicative check [helps] our customers to submit a photo that is right the first time,” said a spokeswoman.
“Over nine million people have used this service and our systems are improving.
“We will continue to develop and evaluate our systems with the objective of making applying for a passport as simple as possible for all.”
The passport application website uses an automated check to detect poor quality photos which do not meet Home Office rules. These include having a neutral expression, a closed mouth and looking straight at the camera.
BBC research found this check to be less accurate on darker-skinned people.
More than 1,000 photographs of politicians from across the world were fed into the online checker.
The results indicated:
- Dark-skinned women are told their photos are poor quality 22% of the time, while the figure for light-skinned women is 14%
- Dark-skinned men are told their photos are poor quality 15% of the time, while the figure for light-skinned men is 9%
Photos of women with the darkest skin were four times more likely to be graded poor quality than those of women with the lightest skin….(More)”.
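The BBC's method, submitting a large batch of photos to the automated checker and comparing how often each group's photos were flagged, can be sketched as below. The counts here are illustrative numbers chosen to match the reported percentages, not the BBC's actual data:

```python
# Illustrative counts only: each group maps to (photos_flagged, photos_submitted).
# These hypothetical numbers are chosen to reproduce the reported percentages.
groups = {
    "dark-skinned women": (55, 250),   # 22%
    "light-skinned women": (35, 250),  # 14%
    "dark-skinned men": (38, 250),     # ~15%
    "light-skinned men": (23, 250),    # ~9%
}

# Rejection rate per group: share of submitted photos flagged "poor quality".
rates = {g: flagged / total for g, (flagged, total) in groups.items()}
for g, r in rates.items():
    print(f"{g}: {r:.0%} flagged as poor quality")

# Disparity expressed as a rate ratio between the two extreme groups.
ratio = rates["dark-skinned women"] / rates["light-skinned men"]
print(f"rate ratio (dark-skinned women vs light-skinned men): {ratio:.1f}x")
```

With these assumed counts the rate ratio comes out above 2, consistent with the article's "more than twice as likely" framing.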
Article by Eliza McCullough: “….Instead of a smart city model that extracts from, surveils, and displaces poor people of color, we need a democratic model that allows community members to decide how technological infrastructure operates and to ensure the equitable distribution of benefits. Doing so will allow us to create cities defined by inclusion, shared ownership, and shared prosperity.
In 2016, Barcelona, for example, launched its Digital City Plan, which aims to empower residents with control of technology used in their communities. The document incorporates over 8,000 proposals from residents and includes plans for open source software, government ownership of all ICT infrastructure, and a pilot platform to help citizens maintain control over their personal data. As a result, the city now has free applications that allow residents to easily propose city development ideas, actively participate in city council meetings, and choose how their data is shared.
In the U.S., we need a framework for tech sovereignty that incorporates a racial equity approach: In a racist society, race neutrality facilitates continued exclusion and exploitation of people of color. Digital Justice Lab in Toronto illustrates one critical element of this kind of approach: access to information. In 2018, the organization gave community groups a series of grants to hold public events that shared resources and information about digital rights. Their collaborative approach intentionally focuses on the specific needs of people of color and other marginalized groups.
The turn toward intensified surveillance infrastructure in the midst of the coronavirus outbreak makes the need to adopt such practices all the more crucial. Democratic tech models that uplift marginalized populations provide us the chance to build a city that is just and open to everyone….(More)”.
Book edited by Maggie Walter, Tahu Kukutai, Stephanie Russo Carroll and Desi Rodriguez-Lonebear: “This book examines how Indigenous Peoples around the world are demanding greater data sovereignty, and challenging the ways in which governments have historically used Indigenous data to develop policies and programs.
In the digital age, governments are increasingly dependent on data and data analytics to inform their policies and decision-making. However, Indigenous Peoples have often been the unwilling targets of policy interventions and have had little say over the collection, use and application of data about them, their lands and cultures. At the heart of Indigenous Peoples’ demands for change are the enduring aspirations of self-determination over their institutions, resources, knowledge and information systems.
With contributors from Australia, Aotearoa New Zealand, North and South America and Europe, this book offers a rich account of the potential for Indigenous data sovereignty to support human flourishing and to protect against the ever-growing threats of data-related risks and harms….(More)”.
Jane Bailey et al at The Conversation: “…In his game-changing 1993 book, The Panoptic Sort, scholar Oscar Gandy warned that “complex technology [that] involves the collection, processing and sharing of information about individuals and groups that is generated through their daily lives … is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy.” Law enforcement uses it to pluck suspects from the general public, and private organizations use it to determine whether we have access to things like banking and employment.
Gandy prophetically warned that, if left unchecked, this form of “cybernetic triage” would exponentially disadvantage members of equality-seeking communities — for example, groups that are racialized or socio-economically disadvantaged — both in terms of what would be allocated to them and how they might come to understand themselves.
Some 25 years later, we’re now living with the panoptic sort on steroids. And examples of its negative effects on equality-seeking communities abound, such as the false identification of Williams.
This algorithmic sorting infiltrates the most fundamental aspects of everyday life, occasioning both direct and structural violence in its wake.
The direct violence experienced by Williams is immediately evident in the events surrounding his arrest and detention, and the individual harms he experienced are obvious and can be traced to the actions of police who chose to rely on the technology’s “match” to make an arrest. More insidious is the structural violence perpetrated through facial recognition technology and other digital technologies that rate, match, categorize and sort individuals in ways that magnify pre-existing discriminatory patterns.
Structural violence harms are less obvious and less direct; they injure equality-seeking groups through the systematic denial of power, resources and opportunity, while simultaneously increasing direct risk and harm to individual members of those groups.
Predictive policing uses algorithmic processing of historical data to predict when and where new crimes are likely to occur, assigns police resources accordingly and embeds enhanced police surveillance into communities, usually in lower-income and racialized neighbourhoods. This increases the chances that any criminal activity — including less serious criminal activity that might otherwise prompt no police response — will be detected and punished, ultimately limiting the life chances of the people who live within that environment….(More)”.
Open access book by Christoph Bartneck, Christoph Lütge, Alan Wagner and Sean Welsh: “This book provides an introduction into the ethics of robots and artificial intelligence. The book was written with university students, policy makers, and professionals in mind but should be accessible for most adults. The book is meant to provide balanced and, at times, conflicting viewpoints as to the benefits and deficits of AI through the lens of ethics. As discussed in the chapters that follow, ethical questions are often not cut and dry. Nations, communities, and individuals may have unique and important perspectives on these topics that should be heard and considered. While the voices that compose this book are our own, we have attempted to represent the views of the broader AI, robotics, and ethics communities….(More)”.
Essay by Stefania Milan and Emiliano Treré: “Quantification is central to the narration of the COVID-19 pandemic. Numbers determine the existence of the problem and affect our ability to care and contribute to relief efforts. Yet many communities at the margins, including many areas of the Global South, are virtually absent from this number-based narration of the pandemic. This essay builds on critical data studies to warn against the universalization of problems, narratives, and responses to the virus. To this end, it explores two types of data gaps and the corresponding “data poor.” The first gap concerns the data poverty perduring in low-income countries and jeopardizing their ability to adequately respond to the pandemic. The second affects vulnerable populations within a variety of geopolitical and socio-political contexts, whereby data poverty constitutes a dangerous form of invisibility which perpetuates various forms of inequality. But, even during the pandemic, the disempowered manage to create innovative forms of solidarity from below that partially mitigate the negative effects of their invisibility….(More)”.
Book edited by Linnet Taylor, Aaron Martin, Gargi Sharma and Shazade Jameson: “In early 2020, as the COVID-19 pandemic swept the world and states of emergency were declared by one country after another, the global technology sector—already equipped with unprecedented wealth, power, and influence—mobilised to seize the opportunity. This collection is an account of what happened next and captures the emergent conflicts and responses around the world. The essays provide a global perspective on the implications of these developments for justice: they make it possible to compare how the intersection of state and corporate power—and the way that power is targeted and exercised—confronts, and invites resistance from, civil society in countries worldwide.
This edited volume captures the technological response to the pandemic in 33 countries, accompanied by nine thematic reflections, and traces the unfolding of the first wave of the pandemic.
This book can be read as a guide to the landscape of technologies deployed during the pandemic and can also be used to compare individual country strategies. It will prove useful as a tool for teaching and learning in various disciplines and as a reference point for activists and analysts interested in issues of data justice.
The essays interrogate these technologies and the political, legal, and regulatory structures that determine how they are applied. In doing so, the book exposes the workings of state technological power to critical assessment and contestation….(More)”
Paper by David Farrell et al: “Ireland’s Citizens’ Assembly (CA) of 2016–18 was tasked with making recommendations on abortion. This paper shows that from the outset its members were in large part in favour of the liberalisation of abortion (though a fair proportion were undecided), that over the course of its deliberations the CA as a whole moved in a more liberal direction on the issue, but that its position was largely reflected in the subsequent referendum vote by the population as a whole….(More)”
Reboot’s “Design With” podcast with Antionette Carroll: “What began as a 24-hour design challenge addressing racial inequality in Ferguson, MO has since grown into a powerful organization fighting inequity with its own brand of collaborative design. Antionette Carroll, founder of Creative Reaction Lab, speaks about Equity-Centered Community Design—and how Black and Latinx youth are using design as their tool of choice to dismantle the very systems designed to exclude them….(More)”.