Legislating Data Loyalty


Paper by Woodrow Hartzog and Neil M. Richards: “Lawmakers looking to embolden privacy law have begun to consider imposing duties of loyalty on organizations trusted with people’s data and online experiences. The idea behind loyalty is simple: organizations should not process data or design technologies that conflict with the best interests of trusting parties. But the logistics and implementation of data loyalty need to be developed if the concept is going to be capable of moving privacy law beyond its “notice and consent” roots to confront people’s vulnerabilities in their relationship with powerful data collectors.

In this short Essay, we propose a model for legislating data loyalty. Our model takes advantage of loyalty’s strengths—it is well-established in our law, it is flexible, and it can accommodate conflicting values. Our Essay also explains how data loyalty can embolden our existing data privacy rules, address emergent dangers, solve privacy’s problems around consent and harm, and establish an antibetrayal ethos as America’s privacy identity.

We propose that lawmakers use a two-step process to (1) articulate a primary, general duty of loyalty, then (2) articulate “subsidiary” duties that are more specific and sensitive to context. Subsidiary duties regarding collection, personalization, gatekeeping, persuasion, and mediation would target the most opportunistic contexts for self-dealing and result in flexible open-ended duties combined with highly specific rules. In this way, a duty of data loyalty is not just appealing in theory—it can be effectively implemented in practice just like the other duties of loyalty our law has recognized for hundreds of years. Loyalty is thus not only flexible, but it is capable of breathing life into America’s historically tepid privacy frameworks…(More)”.

Efficient and stable data-sharing in a public transit oligopoly as a coopetitive game


Paper by Qi Liu and Joseph Y.J. Chow: “In this study, various forms of data sharing are axiomatized. A new way of studying coopetition, especially data-sharing coopetition, is proposed. The problem of the Bayesian game with signal dependence on actions is observed, and a method to handle such dependence is proposed. We focus on fixed-route transit service markets. A discrete model is first presented to analyze the data-sharing coopetition of an oligopolistic transit market when an externality effect exists. Given a fixed data-sharing structure, a Bayesian game is used to capture the competition under uncertainty, while a coalition formation model is used to determine the stable data-sharing decisions. A new method of composite coalition is proposed to study efficient markets. An alternative continuous model is proposed to handle large networks using simulation. We apply these models to various types of networks. Test results show that perfect information may lead to perfect selfishness. Sharing more data does not necessarily improve transit service for all groups, at least if transit operators remain non-cooperative. Service complementarity does not necessarily guarantee a grand data-sharing coalition. These results can provide insights for policy-making, like whether city authorities should enforce compulsory data-sharing along with cooperation between operators or set up a voluntary data-sharing platform…(More)”.

Responsible Data for Children Goes Polyglot: New Translations of Principles & Resources Available


Responsible Data for Children Blog: “In 2018, UNICEF and The GovLab launched the Responsible Data for Children (RD4C) initiative with the aim of supporting organisations and practitioners in ensuring that the interest of children is put at the centre of any work involving data for and about them.

Since its inception, the RD4C initiative has aimed to be field-oriented, driven by the needs of both children and practitioners across sectors and contexts. It has done so by ensuring that actors from the data responsibility sphere are informed and engaged on the RD4C work.

We want them to know what responsible data for and about children entails, why it is important, and how they can realize it in their own work.

In this spirit, the RD4C initiative has started translating its resources into different languages. We would like anyone willing to enhance their responsible data handling practices for and about children to be equipped with resources they can understand. As a global effort, we want to guarantee that anyone willing to share their expertise and contribute is given the opportunity to do so.

Importantly, we would like children around the world—including the most marginalised and vulnerable groups—to be aware of what they can expect from organisations handling data for and about them and to have the means to demand and enforce their rights.

Last month, we released the RD4C Video, which is now available in Arabic, French, and Spanish. Soon, the rest of the RD4C resources, such as our principles, tools, and case studies, will be translated as well.”

Democracy Disrupted: Governance in an Increasingly Virtual and Massively Distributed World


Essay by Eric B. Schnurer: “…In short, it is often difficult to see where new technologies actually will lead. The same technological development can, in different settings, have different effects: The use of horses in warfare, which led seemingly inexorably in China and Europe to more centralized and autocratic states, had the effect on the other side of the world of enabling Hernán Cortés, with an army of roughly five hundred Spaniards, to defeat the massed infantries of the highly centralized, autocratic Aztec regime. Cortés’s example demonstrates that a particular technology generally employed by a concentrated power to centralize and dominate can also be used by a small insurgent force to disperse and disrupt (although in Cortés’s case this was on behalf of the eventual imposition of an even more despotic rule).

Regardless of the lack of inherent ideological content in any given technology, however, our technological realities consistently give metaphorical shape to our ideological constructs. In ancient Egypt, the regularity of the Nile’s flood cycle, which formed the society’s economic basis, gave rise to a belief in recurrent cycles of life and death; in contrast, the comparatively harsh and static agricultural patterns of the more-or-less contemporaneous Mesopotamian world produced a society that conceived of gods who simply tormented humans and then relegated them after death to sit forever in a place of dust and silence; meanwhile, the pastoral societies of the Fertile Crescent have handed down to us the vision of God as shepherd of his flock. (The Bible also gives us, in the story of Cain and Abel, a parable of the deadly conflict that technologically driven economic changes wreak: Abel was a traditional pastoralist—he tended sheep—while Cain, who planted seeds in the ground, represented the disruptive “New Economy” of settled agriculture. Tellingly, after killing off the pastoralist, the sedentarian Cain exits to found the first city [Genesis 4:17].)

As humans developed more advanced technologies, these in turn reshaped our conceptions of the world around us, including the proper social order. Those who possessed superior technological knowledge were invested with supernatural authority: The key to early Rome’s defense was the ability quickly to assemble and disassemble the bridges across the Tiber, so much so that the pontifex maximus—literally the “greatest bridge-builder”—became the high priest, from whose Latin title we derive the term pontiff. The most sophisticated—and arguably most crucial—technology in any town in medieval Europe was its public clock. The clock, in turn, became a metaphor for the mechanical working of the universe—God, in fact, was often conceived of as a clockmaker (a metaphor still frequently invoked to argue against evolution and for the necessity of an intelligent creator)—and for the proper form of social organization: All should know their place and move through time and space as predictably as the figurines making their regular appearances and performing their routinized interactions on the more elaborate and entertaining of these town-square timepieces.

In our own time, the leading technologies continue to provide the organizing concepts for our economic, political, and theological constructs. The factory became such a ubiquitous reflection of economic and social realities that, from the early nineteenth century onward, virtually every social and cultural institution—welfare (the poorhouse, or, as it was often called, the “workhouse”), public safety (the penitentiary), health care (the hospital), mental health (the insane asylum), “workforce” or public housing, even (as teachers often suggest to me) the education system—was consciously remodeled around it. Even when government finally tried to get ahead of the challenges posed by the Industrial Revolution by building the twentieth-century welfare state, it wound up constructing essentially a new capital of the Industrial Age in Washington, DC, with countless New Deal ministries along the Mall—resembling, as much as anything, the rows of factory buildings one can see in the steel and mill towns of the same era.

By the middle of the twentieth century, the atom and the computer came to dominate most intellectual constructs. First, the uncertainty of quantum mechanics upended mechanistic conceptions of social and economic relations, helping to foster conceptions of relativism in everything from moral philosophy to literary criticism. More recently, many scientists have come to the conclusion that the universe amounts to a massive information processor, and popular culture to the conviction that we all simply live inside a giant video game.

In sum, while technological developments are not deterministic—their outcomes being shaped, rather, by the uses we conceive to employ them—our conceptions are largely molded by these dominant technologies and the transformations they effect. (I should note that while this argument is not deterministic, like those of most current thinkers about political and economic development such as Francis Fukuyama, Jared Diamond, and Yuval Noah Harari, neither is it materialistic, like that of Karl Marx. Marx thoroughly rejected human ideas and thinking as movers of history, which he saw as simply shaped and dictated by the technology. I am suggesting instead a dialectic between the ideal and the material.) To repeat the metaphor, technological change constitutes the plate tectonics on which human contingencies are then built. To understand, then, the deeper movements of thought, economic arrangements, and political developments, both historical and contemporary, one must understand the nature of the technologies underlying and driving their unfolding…(More)”.

Social Media, Freedom of Speech, and the Future of our Democracy


Book edited by Lee C. Bollinger and Geoffrey R. Stone: “One of the most fiercely debated issues of this era is what to do about “bad” speech (hate speech, disinformation and propaganda campaigns, and incitement of violence) on the internet, and in particular speech on social media platforms such as Facebook and Twitter. In Social Media, Freedom of Speech, and the Future of our Democracy, Lee C. Bollinger and Geoffrey R. Stone have gathered an eminent cast of contributors—including Hillary Clinton, Amy Klobuchar, Sheldon Whitehouse, Mark Warner, Newt Minow, Tim Wu, Cass Sunstein, Jack Balkin, Emily Bazelon, and others—to explore the various dimensions of this problem in the American context. They stress how difficult it is to develop remedies given that some of these forms of “bad” speech are ordinarily protected by the First Amendment. Bollinger and Stone argue that it is important to remember that the last time we encountered major new communications technology (television and radio), we established a federal agency to provide oversight and to issue regulations to protect and promote “the public interest.” Featuring a variety of perspectives from some of America’s leading experts on this hotly contested issue, this volume offers new insights for the future of free speech in the social media era…(More)”.

Roe’s overturn is tech’s privacy apocalypse


Scott Rosenberg at Axios: “America’s new abortion reality is turning tech firms’ data practices into an active field of conflict — a fight that privacy advocates have long predicted and company leaders have long feared.

Why it matters: A long legal siege in which abortion-banning states battle tech companies, abortion-friendly states, and their own citizens to gather criminal evidence is now a near certainty.

  • The once-abstract privacy argument among policy experts has transformed overnight into a concrete real-world problem, superheated by partisan anger, affecting vast swaths of the U.S. population, with tangible and easily understood consequences.

Driving the news: Google announced Friday a new program to automatically delete the location data of users who visit “particularly personal” locations like “counseling centers, domestic violence shelters, abortion clinics, fertility centers, addiction treatment facilities, weight loss clinics, cosmetic surgery clinics, and others.”

  • Google tracks the location of any user who turns on its “location services” — a choice that’s required to make many of its programs, like Google Search and Maps, more useful.
  • That tracking happens even when you’re logged into non-location-related Google services like YouTube, since Google long ago unified all its accounts.

Between the lines: Google’s move won cautious applause but left plenty of open concerns.

  • It’s not clear how, and how reliably, Google will identify the locations that trigger automatic data deletion.
  • The company will not delete search requests automatically — users who want that protection will have to delete them manually.
  • A sudden gap in location data could itself be used as evidence in court…(More)”.

How China uses search engines to spread propaganda


Blog by Jessica Brandt and Valerie Wirtschafter: “Users come to search engines seeking honest answers to their queries. On a wide range of issues—from personal health, to finance, to news—search engines are often the first stop for those looking to get information online. But as authoritarian states like China increasingly use online platforms to disseminate narratives aimed at weakening their democratic competitors, these search engines represent a crucial battleground in their information war with rivals. For Beijing, search engines represent a key—and underappreciated—vector for spreading propaganda to audiences around the world.

On a range of topics of geopolitical importance, Beijing has exploited search engine results to disseminate state-backed media that amplify the Chinese Communist Party’s propaganda. As we demonstrate in our recent report, published by the Brookings Institution in collaboration with the German Marshall Fund’s Alliance for Securing Democracy, users turning to search engines for information on Xinjiang, the site of the CCP’s egregious human rights abuses of the region’s Uyghur minority, or the origins of the coronavirus pandemic are surprisingly likely to encounter articles on these topics published by Chinese state-media outlets. By prominently surfacing this type of content, search engines may play a key role in Beijing’s effort to shape external perceptions, which makes it crucial that platforms—along with authoritative outlets that syndicate state-backed content without clear labeling—do more to address their role in spreading these narratives…(More)”.

Moral Expansiveness Around the World: The Role of Societal Factors Across 36 Countries


Paper by Kelly Kirkland et al: “What are the things that we think matter morally, and how do societal factors influence this? To date, research has explored several individual-level and historical factors that influence the size of our ‘moral circles.’ There has, however, been less attention focused on which societal factors play a role. We present the first multi-national exploration of moral expansiveness—that is, the size of people’s moral circles across countries. We found low generalized trust, greater perceptions of a breakdown in the social fabric of society, and greater perceived economic inequality were associated with smaller moral circles. Generalized trust also helped explain the effects of perceived inequality on lower levels of moral inclusiveness. Other inequality indicators (i.e., Gini coefficients) were, however, unrelated to moral expansiveness. These findings suggest that societal factors, especially those associated with generalized trust, may influence the size of our moral circles…(More)”.

Police Violence In Puerto Rico: Flooded With Data


Blog by Christine Grillo: “For María Mari-Narváez, a recent decision by the Supreme Court of Puerto Rico was both a victory and a moment of reckoning. The Court granted Kilómetro Cero, a citizen-led police accountability project in Puerto Rico, full access to every use-of-force report filed by the Puerto Rico Police Department since 2014. The decision will make it possible for advocates such as Mari to get a clear picture of how state police officers are using force, and when that use of force crosses the line into abuse. But the court victory flooded her small organization with data.

“We won, finally, and then I realized I was going to be receiving thousands of documents that I had zero capacity to process,” says Mari.

“One of the things that’s important to me when analyzing data is to find out where the gaps are, why those gaps exist, and what those gaps represent.” —Tarak Shah, data scientist

The Court made its decision in April 2021, and the police department started handing over PDF files in July. In the end, as many as 10,000 documents could be turned in. In addition to incident reports, the police had to provide their use-of-force database. Combined, the victory provides a complicated mixture of quantitative and qualitative data that can be analyzed to answer questions about what the state police are doing to citizens during police interventions. In particular, Kilómetro Cero, which Mari founded, wants to find out if some Puerto Ricans are more likely to be victims of police violence than others.

“We’re looking for bias,” says Mari. “Bias against poor people, or people who live in a certain neighborhood. Gender bias. Language bias. Bias against drug users, sex workers, immigrants, people who don’t have a house. We’re trying to analyze the language of vulnerability.”…(More)”.

Non-human humanitarianism: when ‘AI for good’ can be harmful


Paper by Mirca Madianou: “Artificial intelligence (AI) applications have been introduced in humanitarian operations in order to help with the significant challenges the sector is facing. This article focuses on chatbots, which have been proposed as an efficient method to improve communication with, and accountability to, affected communities. Chatbots, together with other humanitarian AI applications such as biometrics, satellite imaging, predictive modelling and data visualisations, are often understood as part of the wider phenomenon of ‘AI for social good’. The article develops a decolonial critique of humanitarianism and critical algorithm studies which focuses on the power asymmetries underpinning both humanitarianism and AI. The article asks whether chatbots, as exemplars of ‘AI for good’, reproduce inequalities in the global context. Drawing on a mixed methods study that includes interviews with seven groups of stakeholders, the analysis observes that humanitarian chatbots do not fulfil claims such as ‘intelligence’. Yet AI applications still have powerful consequences. Apart from the risks associated with misinformation and data safeguarding, chatbots reduce communication to its barest instrumental forms, which creates disconnects between affected communities and aid agencies. This disconnect is compounded by the extraction of value from data and experimentation with untested technologies. By reflecting the values of their designers and by asserting Eurocentric values in their programmed interactions, chatbots reproduce the coloniality of power. The article concludes that ‘AI for good’ is an ‘enchantment of technology’ that reworks the colonial legacies of humanitarianism whilst also occluding the power dynamics at play…(More)”.