Data as Property?


Blog by Salomé Viljoen: “Since the proliferation of the World Wide Web in the 1990s, critics of widely used internet communications services have warned of the misuse of personal data. Alongside familiar concerns regarding user privacy and state surveillance, a now-decades-long thread connects a group of theorists who view data—and in particular data about people—as central to what they have termed informational capitalism.1 Critics locate in datafication—the transformation of information into a commodity—a particular economic process of value creation that demarcates informational capitalism from its predecessors. Whether these critics take “information” or “capitalism” as the modifier warranting primary concern, datafication, in their analysis, serves a dual role: both a process of production and a form of injustice.

In arguments levied against informational capitalism, the creation, collection, and use of data feature prominently as an unjust way to order productive activity. For instance, in her 2019 blockbuster The Age of Surveillance Capitalism, Shoshana Zuboff likens our inner lives to a pre-colonial continent, invaded and strip-mined of data by technology companies seeking profits.2 Elsewhere, Jathan Sadowski identifies data as a distinct form of capital, and accordingly links the imperative to collect data to the perpetual cycle of capital accumulation.3 Julie Cohen, in the Polanyian tradition, traces the “quasi-ownership through enclosure” of data and identifies the processing of personal information in “data refineries” as a fourth factor of production under informational capitalism.4

Critiques breed proposals for reform. Thus, data governance emerges as key terrain on which to discipline firms engaged in datafication and to respond to the injustices of informational capitalism. Scholars, activists, technologists and even presidential candidates have all proposed data governance reforms to address the social ills generated by the technology industry.

These reforms generally come in two varieties. Propertarian reforms diagnose the source of datafication’s injustice in the absence of formal property (or, alternatively, labor) rights regulating the process of production. In 2016, the inventor of the World Wide Web, Sir Tim Berners-Lee, founded Solid, a web decentralization platform, out of his concern over how data extraction fuels the growing power imbalance of the web, which, he notes, “has evolved into an engine of inequity and division; swayed by powerful forces who use it for their own agendas.” In response, Solid “aims to radically change the way Web applications work today, resulting in true data ownership as well as improved privacy.” Solid is one popular project within the blockchain community’s #ownyourdata movement; another is Radical Markets, a suite of proposals from Glen Weyl (an economist and researcher at Microsoft) that includes developing a labor market for data. Like Solid, Weyl’s project is in part a response to inequality: it aims to disrupt the digital economy’s “technofeudalism,” where the unremunerated fruits of data laborers’ toil help drive the inequality of the technology economy writ large.5 Progressive politicians from Andrew Yang to Alexandria Ocasio-Cortez have similarly advanced proposals to reform the information economy, proposing variations on the theme of user ownership of personal data.

The second type of reform, which I call dignitarian, takes a further step beyond asserting rights to data-as-property and resists data’s commodification altogether, drawing on a framework of civil and human rights to advocate for increased protections. Proposed reforms along these lines grant individuals meaningful capacity to say no to forms of data collection they disagree with, to determine the fate of data collected about them, and to assert rights against data about them being used in ways that violate their interests….(More)”.

Civil Liberties in Times of Crisis


Paper by Marcella Alsan, Luca Braghieri, Sarah Eichmeyer, Minjeong Joyce Kim, Stefanie Stantcheva, and David Y. Yang: “The respect for and protection of civil liberties are among the fundamental roles of the state, and many consider civil liberties sacred and “nontradable.” Using cross-country representative surveys that cover 15 countries and over 370,000 respondents, we study whether and the extent to which citizens are willing to trade off civil liberties during the COVID-19 pandemic, one of the largest crises in recent history. We find four main results. First, many around the world reveal a clear willingness to trade off civil liberties for improved public health conditions. Second, consistent across countries, exposure to health risks is associated with citizens’ greater willingness to trade off civil liberties, though individuals who are more economically disadvantaged are less willing to do so. Third, attitudes concerning such trade-offs are elastic to information. Fourth, we document a gradual decline and then plateau in citizens’ overall willingness to sacrifice rights and freedoms as the pandemic progresses, though the underlying correlation between individuals’ worry about health and their attitudes over the trade-offs has been remarkably constant. Our results suggest that citizens do not view civil liberties as sacred values; rather, they are willing to trade off civil liberties more or less readily, at least in the short-run, depending on their own circumstances and information….(More)”.

Responsible group data for children


Issue Brief by Andrew Young: “Understanding how and why group data is collected and what can be done to protect children’s rights…The data protection field largely focuses on individual data harms, a focus that obscures, and can exacerbate, the risks data poses to groups of people, such as the residents of a particular village, rather than to individuals.

Though not well-represented in the current responsible data literature and policy domains writ large, the challenges group data poses are immense. Moreover, the unique and amplified group data risks facing children are even less scrutinized and understood.

To achieve Responsible Data for Children (RD4C) and ensure effective and legitimate governance of children’s data, government policymakers, data practitioners, and institutional decision makers need to ensure children’s group data are a core consideration in all relevant policies, procedures, and practices….(More)”. (See also Responsible Data for Children).

UK passport photo checker shows bias against dark-skinned women


Maryam Ahmed at BBC News: “Women with darker skin are more than twice as likely as lighter-skinned men to be told their photos fail UK passport rules when they submit them online, according to a BBC investigation.

One black student said she was wrongly told her mouth looked open in each of five different photos she uploaded to the government website.

This shows how “systemic racism” can spread, Elaine Owusu said.

The Home Office said the tool helped users get their passports more quickly.

“The indicative check [helps] our customers to submit a photo that is right the first time,” said a spokeswoman.

“Over nine million people have used this service and our systems are improving.

“We will continue to develop and evaluate our systems with the objective of making applying for a passport as simple as possible for all.”

Skin colour

The passport application website uses an automated check to detect poor quality photos which do not meet Home Office rules. These include having a neutral expression, a closed mouth and looking straight at the camera.

BBC research found this check to be less accurate on darker-skinned people.

More than 1,000 photographs of politicians from across the world were fed into the online checker.

The results indicated:

  • Dark-skinned women are told their photos are poor quality 22% of the time, while the figure for light-skinned women is 14%
  • Dark-skinned men are told their photos are poor quality 15% of the time, while the figure for light-skinned men is 9%

Photos of women with the darkest skin were four times more likely to be graded poor quality than photos of women with the lightest skin….(More)”.
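The headline comparisons follow directly from the reported figures. As back-of-the-envelope arithmetic (a reader’s sketch of the implied ratios, not the BBC’s actual methodology or code), the relative likelihoods work out as follows:

```python
# Poor-quality verdict rates reported by the BBC investigation,
# keyed by (skin tone, gender).
rates = {
    ("dark", "women"): 0.22,
    ("light", "women"): 0.14,
    ("dark", "men"): 0.15,
    ("light", "men"): 0.09,
}

def relative_likelihood(group_a, group_b):
    """How many times more likely group_a is than group_b
    to receive a poor-quality verdict."""
    return rates[group_a] / rates[group_b]

# Dark-skinned women vs light-skinned men: 0.22 / 0.09 ≈ 2.4,
# i.e. "more than twice as likely".
print(round(relative_likelihood(("dark", "women"), ("light", "men")), 1))  # 2.4
```

Note these are relative rates from one sample of about 1,000 photos; the piece does not report confidence intervals, so the ratios should be read as indicative rather than precise.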

The secret to building a smart city that’s antiracist


Article by Eliza McCullough: “….Instead of a smart city model that extracts from, surveils, and displaces poor people of color, we need a democratic model that allows community members to decide how technological infrastructure operates and to ensure the equitable distribution of benefits. Doing so will allow us to create cities defined by inclusion, shared ownership, and shared prosperity.

In 2016, for example, Barcelona launched its Digital City Plan, which aims to empower residents with control of the technology used in their communities. The document incorporates over 8,000 proposals from residents and includes plans for open source software, government ownership of all ICT infrastructure, and a pilot platform to help citizens maintain control over their personal data. As a result, the city now has free applications that allow residents to easily propose city development ideas, actively participate in city council meetings, and choose how their data is shared.

In the U.S., we need a framework for tech sovereignty that incorporates a racial equity approach: in a racist society, race neutrality facilitates continued exclusion and exploitation of people of color. The Digital Justice Lab in Toronto illustrates one critical element of this kind of approach: access to information. In 2018, the organization gave community groups a series of grants to hold public events that shared resources and information about digital rights. Their collaborative approach intentionally focuses on the specific needs of people of color and other marginalized groups.

The turn toward intensified surveillance infrastructure in the midst of the coronavirus outbreak makes the need to adopt such practices all the more crucial. Democratic tech models that uplift marginalized populations provide us the chance to build a city that is just and open to everyone….(More)”.

Indigenous Data Sovereignty and Policy


Book edited by Maggie Walter, Tahu Kukutai, Stephanie Russo Carroll and Desi Rodriguez-Lonebear: “This book examines how Indigenous Peoples around the world are demanding greater data sovereignty, and challenging the ways in which governments have historically used Indigenous data to develop policies and programs.

In the digital age, governments are increasingly dependent on data and data analytics to inform their policies and decision-making. However, Indigenous Peoples have often been the unwilling targets of policy interventions and have had little say over the collection, use and application of data about them, their lands and cultures. At the heart of Indigenous Peoples’ demands for change are the enduring aspirations of self-determination over their institutions, resources, knowledge and information systems.

With contributors from Australia, Aotearoa New Zealand, North and South America and Europe, this book offers a rich account of the potential for Indigenous data sovereignty to support human flourishing and to protect against the ever-growing threats of data-related risks and harms….(More)”.

AI technologies — like police facial recognition — discriminate against people of colour


Jane Bailey et al. at The Conversation: “…In his game-changing 1993 book, The Panoptic Sort, scholar Oscar Gandy warned that “complex technology [that] involves the collection, processing and sharing of information about individuals and groups that is generated through their daily lives … is used to coordinate and control their access to the goods and services that define life in the modern capitalist economy.” Law enforcement uses it to pluck suspects from the general public, and private organizations use it to determine whether we have access to things like banking and employment.

Gandy prophetically warned that, if left unchecked, this form of “cybernetic triage” would exponentially disadvantage members of equality-seeking communities — for example, groups that are racialized or socio-economically disadvantaged — both in terms of what would be allocated to them and how they might come to understand themselves.

Some 25 years later, we’re now living with the panoptic sort on steroids. And examples of its negative effects on equality-seeking communities abound, such as the false identification of Williams.

Pre-existing bias

This algorithmic sorting infiltrates the most fundamental aspects of everyday life, occasioning both direct and structural violence in its wake.

The direct violence experienced by Williams is immediately evident in the events surrounding his arrest and detention, and the individual harms he experienced are obvious and can be traced to the actions of police who chose to rely on the technology’s “match” to make an arrest. More insidious is the structural violence perpetrated through facial recognition technology and other digital technologies that rate, match, categorize and sort individuals in ways that magnify pre-existing discriminatory patterns.

Structural violence is less obvious and less direct; it injures equality-seeking groups through the systematic denial of power, resources and opportunity, while simultaneously increasing direct risk and harm to individual members of those groups.

Predictive policing uses algorithmic processing of historical data to predict when and where new crimes are likely to occur, assigns police resources accordingly and embeds enhanced police surveillance into communities, usually in lower-income and racialized neighbourhoods. This increases the chances that any criminal activity — including less serious criminal activity that might otherwise prompt no police response — will be detected and punished, ultimately limiting the life chances of the people who live within that environment….(More)”.

An Introduction to Ethics in Robotics and AI


Open access book by Christoph Bartneck, Christoph Lütge, Alan Wagner and Sean Welsh: “This book provides an introduction to the ethics of robots and artificial intelligence. The book was written with university students, policy makers, and professionals in mind but should be accessible for most adults. It is meant to provide balanced and, at times, conflicting viewpoints as to the benefits and deficits of AI through the lens of ethics. As discussed in the chapters that follow, ethical questions are often not cut and dried. Nations, communities, and individuals may have unique and important perspectives on these topics that should be heard and considered. While the voices that compose this book are our own, we have attempted to represent the views of the broader AI, robotics, and ethics communities….(More)”.

The Rise of the Data Poor: The COVID-19 Pandemic Seen From the Margins


Essay by Stefania Milan and Emiliano Treré: “Quantification is central to the narration of the COVID-19 pandemic. Numbers determine the existence of the problem and affect our ability to care and contribute to relief efforts. Yet many communities at the margins, including many areas of the Global South, are virtually absent from this number-based narration of the pandemic. This essay builds on critical data studies to warn against the universalization of problems, narratives, and responses to the virus. To this end, it explores two types of data gaps and the corresponding “data poor.” The first gap concerns the data poverty perduring in low-income countries and jeopardizing their ability to adequately respond to the pandemic. The second affects vulnerable populations within a variety of geopolitical and socio-political contexts, whereby data poverty constitutes a dangerous form of invisibility which perpetuates various forms of inequality. But, even during the pandemic, the disempowered manage to create innovative forms of solidarity from below that partially mitigate the negative effects of their invisibility….(More)”.

Data Justice and COVID-19: Global Perspectives


Book edited by Linnet Taylor, Aaron Martin, Gargi Sharma and Shazade Jameson: “In early 2020, as the COVID-19 pandemic swept the world and states of emergency were declared by one country after another, the global technology sector—already equipped with unprecedented wealth, power, and influence—mobilised to seize the opportunity. This collection is an account of what happened next and captures the emergent conflicts and responses around the world. The essays provide a global perspective on the implications of these developments for justice: they make it possible to compare how the intersection of state and corporate power—and the way that power is targeted and exercised—confronts, and invites resistance from, civil society in countries worldwide.

This edited volume captures the technological response to the pandemic in 33 countries, accompanied by nine thematic reflections, and reflects the unfolding of the first wave of the pandemic.

This book can be read as a guide to the landscape of technologies deployed during the pandemic and also be used to compare individual country strategies. It will prove useful as a tool for teaching and learning in various disciplines and as a reference point for activists and analysts interested in issues of data justice.

The essays interrogate these technologies and the political, legal, and regulatory structures that determine how they are applied. In doing so, the book exposes the workings of state technological power to critical assessment and contestation….(More)”