Responsible Data for Children


New Site and Report by UNICEF and The GovLab: “RD4C seeks to build awareness regarding the need for special attention to data issues affecting children—especially in this age of changing technology and data linkage; and to engage with governments, communities, and development actors to put the best interests of children and a child rights approach at the center of our data activities. The right data in the right hands at the right time can significantly improve outcomes for children. The challenge is to understand the potential risks and ensure that the collection, analysis and use of data on children does not undermine these benefits.

Drawing upon field-based research and established good practice, RD4C aims to highlight and support best practice data responsibility; identify challenges and develop practical tools to assist practitioners in evaluating and addressing them; and encourage a broader discussion on actionable principles, insights, and approaches for responsible data management….(More)”.

How We Became Our Data


Book by Colin Koopman: “We are now acutely aware, as if all of a sudden, that data matters enormously to how we live. How did information come to be so integral to what we can do? How did we become people who effortlessly present our lives in social media profiles and who are meticulously recorded in state surveillance dossiers and online marketing databases? What is the story behind data coming to matter so much to who we are?


In How We Became Our Data, Colin Koopman excavates early moments of our rapidly accelerating data-tracking technologies and their consequences for how we think of and express our selfhood today. Koopman explores the emergence of mass-scale record-keeping systems like birth certificates and social security numbers, as well as new data techniques for categorizing personality traits, measuring intelligence, and even racializing subjects. This all culminates in what Koopman calls the “informational person” and the “informational power” we are now subject to. The recent explosion of digital technologies that are turning us into a series of algorithmic data points is shown to have a deeper and more turbulent past than we commonly think. Blending philosophy, history, political theory, and media theory in conversation with thinkers like Michel Foucault, Jürgen Habermas, and Friedrich Kittler, Koopman presents an illuminating perspective on how we have come to think of our personhood—and how we can resist its erosion….(More)”.

An Open Letter to Law School Deans about Privacy Law Education in Law Schools


Daniel Solove: “Recently a group of legal academics and practitioners in the field of privacy law sent a letter to the deans of all U.S. law schools about privacy law education in law schools. My own brief intro about this endeavor is here in italics, followed by the letter. The signatories endorsed the letter itself, not this italicized intro.

Although the field of privacy law has grown dramatically in the past two decades, education in law schools about privacy law has significantly lagged behind. Most U.S. law schools lack a course on privacy law. Of those that have courses, many are small seminars, often taught by adjuncts, and most offer just one course. Most schools lack a full-time faculty member who focuses substantially on privacy law.

This state of affairs is a great detriment to students. I am constantly approached by students and graduates from law schools across the country who are wondering how they can learn about privacy law and enter the field. Many express great disappointment at the lack of any courses, faculty, or activities at their schools.

After years of hoping that the legal academy would wake up and respond, I came to the realization that this wasn’t going to happen on its own. The following letter [click here for the PDF version] aims to make deans aware of the privacy law field. I hope that the letter is met with action….(More)”.

Americans’ views about privacy, surveillance and data-sharing


Pew Research Center: “In key ways, today’s digitally networked society runs on quid pro quos: People exchange details about themselves and their activities for services and products on the web or apps. Many are willing to accept the deals they are offered in return for sharing insight about their purchases, behaviors and social lives. At times, their personal information is collected by government on the grounds that there are benefits to public safety and security.

A majority of Americans are concerned about this collection and use of their data, according to a new report from Pew Research Center….

Americans vary in their attitudes toward data-sharing in the pursuit of public good. Though many Americans don’t think they benefit much from the collection of their data and find that the potential risks of this practice outweigh the benefits, there are some scenarios in which the public is more likely to accept the idea of data-sharing. In line with findings in a 2015 Center survey showing that some Americans are comfortable with trade-offs in sharing data, about half of U.S. adults (49%) say it is acceptable for the government to collect data about all Americans in order to assess potential terrorist threats. That compares with 31% who feel it is unacceptable to collect data about all Americans for that purpose. By contrast, just one-quarter say it is acceptable for smart speaker makers to share users’ audio recordings with law enforcement to help with criminal investigations, versus 49% who find that unacceptable….(More)”.

Google’s ‘Project Nightingale’ Gathers Personal Health Data on Millions of Americans


Rob Copeland at Wall Street Journal: “Google is engaged with one of the U.S.’s largest health-care systems on a project to collect and crunch the detailed personal-health information of millions of people across 21 states.

The initiative, code-named “Project Nightingale,” appears to be the biggest effort yet by a Silicon Valley giant to gain a toehold in the health-care industry through the handling of patients’ medical data. Amazon.com Inc., Apple Inc. and Microsoft Corp. are also aggressively pushing into health care, though they haven’t yet struck deals of this scope.

Google began Project Nightingale in secret last year with St. Louis-based Ascension, a Catholic chain of 2,600 hospitals, doctors’ offices and other facilities, with the data sharing accelerating since summer, according to internal documents.

The data involved in the initiative encompasses lab results, doctor diagnoses and hospitalization records, among other categories, and amounts to a complete health history, including patient names and dates of birth….

Neither patients nor doctors have been notified. At least 150 Google employees already have access to much of the data on tens of millions of patients, according to a person familiar with the matter and the documents.

In a news release issued after The Wall Street Journal reported on Project Nightingale on Monday, the companies said the initiative is compliant with federal health law and includes robust protections for patient data….(More)”.

Kenya passes data protection law crucial for tech investments


George Obulutsa and Duncan Miriri at Reuters: “Kenyan President Uhuru Kenyatta on Friday approved a data protection law which complies with European Union legal standards as it looks to bolster investment in its information technology sector.

The East African nation has attracted foreign firms with innovations such as Safaricom’s M-Pesa mobile money services, but the lack of safeguards in handling personal data has held it back from its full potential, officials say.

“Kenya has joined the global community in terms of data protection standards,” Joe Mucheru, minister for information, technology and communication, told Reuters.

The new law sets out restrictions on how personally identifiable data obtained by firms and government entities can be handled, stored and shared, the government said.

Mucheru said it complies with the EU’s General Data Protection Regulation, which came into effect in May 2018, and said an independent office will investigate data infringements….

A lack of data protection legislation has also hampered the government’s efforts to digitize identity records for citizens.

The registration, which the government said would boost its provision of services, suffered a setback this year when the exercise was challenged in court.

“The lack of a data privacy law has been an enormous lacuna in Kenya’s digital rights landscape,” said Nanjala Nyabola, author of a book on information technology and democracy in Kenya….(More)”.

The Ethical Algorithm: The Science of Socially Aware Algorithm Design


Book by Michael Kearns and Aaron Roth: “Over the course of a generation, algorithms have gone from mathematical abstractions to powerful mediators of daily life. Algorithms have made our lives more efficient, more entertaining, and, sometimes, better informed. At the same time, complex algorithms are increasingly violating the basic rights of individual citizens. Allegedly anonymized datasets routinely leak our most sensitive personal information; statistical models for everything from mortgages to college admissions reflect racial and gender bias. Meanwhile, users manipulate algorithms to “game” search engines, spam filters, online reviewing services, and navigation apps.

Understanding and improving the science behind the algorithms that run our lives is rapidly becoming one of the most pressing issues of this century. Traditional fixes, such as laws, regulations and watchdog groups, have proven woefully inadequate. Reporting from the cutting edge of scientific research, The Ethical Algorithm offers a new approach: a set of principled solutions based on the emerging and exciting science of socially aware algorithm design. Michael Kearns and Aaron Roth explain how we can better embed human principles into machine code – without halting the advance of data-driven scientific exploration. Weaving together innovative research with stories of citizens, scientists, and activists on the front lines, The Ethical Algorithm offers a compelling vision for a future, one in which we can better protect humans from the unintended impacts of algorithms while continuing to inspire wondrous advances in technology….(More)”.

The Value of Data: Towards a Framework to Redistribute It


Paper by Maria Savona: “This note attempts a systematisation of different pieces of literature that underpin the recent policy and academic debate on the value of data. It mainly poses foundational questions around the definition, economic nature and measurement of data value, and discusses the opportunity to redistribute it. It then articulates a framework to compare ways of implementing redistribution, distinguishing between data as capital, data as labour and data as intellectual property. Each of these raises challenges, revolving around the notions of data property and data rights, that are also briefly discussed. The note concludes by indicating areas for policy considerations and a research agenda to shape the future structure of data governance at large….(More)”.

Restrictions on Privacy and Exploitation in the Digital Economy: A Competition Law Perspective


Paper by Nicholas Economides and Ioannis Lianos: “The recent controversy on the intersection of competition law with the protection of privacy, following the emergence of big data and social media, is a major challenge for competition authorities worldwide. Recent technological progress in data analytics may greatly facilitate the prediction of personality traits and attributes from even a few digital records of human behaviour.


There are different perspectives globally as to the level of personal data protection and the role competition law may play in this context, hence the discussion of integrating such concerns in competition law enforcement may be premature for some jurisdictions. However, a market failure approach may provide common intellectual foundations for the assessment of harms associated with the exploitation of personal data, even when the specific legal system does not formally recognize a fundamental right to privacy.


The paper presents a model of market failure based on a requirement provision in the acquisition of personal information from users of other products/services. We establish the economic harm from the market failure and the requirement using the traditional competition law toolbox and focusing more on situations in which the restriction on privacy may be analysed as a form of exploitation. Eliminating the requirement and the market failure by creating a functioning market for the sale of personal information is imperative. This emphasis on exploitation does not mean that restrictions on privacy may not result from exclusionary practices. However, we analyse this issue in a separate study.


Besides the traditional analysis of the requirement and market failure, we note that there are typically informational asymmetries between the data controller and the data subject. The latter may not be aware that his data was harvested in the first place, or that the data will be processed by the data controller for a different purpose or shared and sold to third parties. The exploitation of personal data may also result from economic coercion, on the basis of resource dependence or lock-in: the user, particularly in the presence of dominance, has no choice but to consent to the harvesting and use of his data in order to enjoy a specific service provided by the data controller or its ecosystem. A behavioural approach would also emphasise the possible internalities (demand-side market failures) arising from bounded rationality, or the fact that people do not internalise all consequences of their actions and face limits in their cognitive capacities.
The paper also addresses the way competition law could engage with exploitative conduct leading to privacy harm, for both ex ante and ex post enforcement.


With regard to ex ante enforcement, the paper explores how privacy concerns may be integrated in merger control as part of the definition of product quality. The harm in question is merely exploitative (the possibility that data aggregation provides the merged entity to exploit personal data in ways that harm consumers directly), rather than exclusionary (harming consumers by enabling the merged entity to marginalise a rival with better privacy policies), which is examined in a separate paper.


With regard to ex post enforcement, the paper explores different theories of harm that may give rise to competition law concerns and suggests specific tests for their assessment. In particular, we analyse old and new exploitative theories of harm relating to excessive data extraction, personalised pricing, unfair commercial practices and trading conditions, exploitative requirement contracts, and behavioural manipulation.
We are in favour of collective action to restore the conditions of a well-functioning data market and the paper makes several policy recommendations….(More)”.

User Data as Public Resource: Implications for Social Media Regulation


Paper by Philip Napoli: “Revelations about the misuse and insecurity of user data gathered by social media platforms have renewed discussions about how best to characterize property rights in user data. At the same time, revelations about the use of social media platforms to disseminate disinformation and hate speech have prompted debates over the need for government regulation to assure that these platforms serve the public interest. These debates often hinge on whether any of the established rationales for media regulation apply to social media. This article argues that the public resource rationale that has been utilized in traditional media regulation in the United States applies to social media.

The public resource rationale contends that, when a media outlet utilizes a public resource—such as the broadcast spectrum, or public rights of way—the outlet must abide by certain public interest obligations that may infringe upon its First Amendment rights. This article argues that aggregate user data can be conceptualized as a public resource that triggers the application of a public interest regulatory framework to social media sites and other digital platforms that derive their revenue from the gathering, sharing, and monetization of massive aggregations of user data….(More)”.