Governance mechanisms for sharing of health data: An approach towards selecting attributes for complex discrete choice experiment studies


Paper by Jennifer Viberg Johansson: “A Discrete Choice Experiment (DCE) is a well-established technique to elicit individual preferences, but it has rarely been used to elicit governance preferences for health data sharing.

The aim of this article was to describe the process of identifying attributes for a DCE study aiming to elicit the preferences of citizens in Sweden, Iceland and the UK regarding governance mechanisms for digitally sharing different kinds of health data in different contexts.

A three-step approach was utilised to inform the attribute and level selection: 1) Attribute identification, 2) Attribute development and 3) Attribute refinement. First, we developed an initial set of potential attributes from a literature review and a workshop with experts. To develop the attributes further, focus group discussions with citizens (n = 13), ranking exercises among focus group participants (n = 48) and expert interviews (n = 18) were performed. Thereafter, attributes were refined using group discussions (n = 3) with experts as well as cognitive interviews with citizens (n = 11).

The results led to the selection of seven attributes for further development: 1) level of identification, 2) purpose of data use, 3) type of information, 4) consent, 5) new data user, 6) collector and 7) oversight of data sharing. Differences were found between countries in the ranking of the top three attributes. The process revealed how participants conceptualised the chosen attributes, and what we learned informed our attribute development phase.

This study demonstrates a process for selecting attributes for a (multi-country) DCE involving three stages: Attribute identification, Attribute development and Attribute refinement. It can contribute to improving the ethical aspects and good practice of this phase in DCE studies. Specifically, it can contribute to the development of governance mechanisms in the digital world, where people’s health data are shared for multiple purposes….(More)”.
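To make the mechanics of a DCE concrete: attributes and their levels are combined into profiles, and respondents repeatedly choose between alternative profiles. The minimal Python sketch below builds a toy choice task from three of the seven attributes above. The levels shown are hypothetical illustrations, not the levels used in the study, and real DCE studies use statistically efficient experimental designs rather than the random draws used here.

```python
import random

# Hypothetical levels for three of the seven selected attributes;
# these are illustrative stand-ins, not the levels used in the study.
ATTRIBUTES = {
    "level of identification": ["anonymised", "pseudonymised", "identifiable"],
    "purpose of data use": ["public research", "commercial research"],
    "consent": ["broad consent", "specific consent", "opt-out"],
}

def make_choice_task(n_alternatives=2, rng=random):
    """Build one choice task: each alternative is a full profile of
    attribute levels, and the respondent picks the one they prefer."""
    return [
        {attr: rng.choice(levels) for attr, levels in ATTRIBUTES.items()}
        for _ in range(n_alternatives)
    ]

if __name__ == "__main__":
    random.seed(1)
    for i, alternative in enumerate(make_choice_task(), start=1):
        print(f"Alternative {i}: {alternative}")
```

Respondents' repeated choices over such tasks are then typically modelled with conditional or mixed logit to estimate how strongly each attribute level drives preference.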

Privacy Tech’s Third Generation


“A Review of the Emerging Privacy Tech Sector” by Privacy Tech Alliance and Future of Privacy Forum: “As we enter the third phase of development of the privacy tech market, purchasers are demanding more integrated solutions, product offerings are more comprehensive, and startup valuations are higher than ever, according to a new report from the Future of Privacy Forum and Privacy Tech Alliance. These factors are leading to companies providing a wider range of services, acting as risk management platforms, and focusing on support of business outcomes.

According to the report, “Privacy Tech’s Third Generation: A Review of the Emerging Privacy Tech Sector,” regulations are often the biggest driver of buyers’ initial privacy tech purchases. Organizations also are deploying tools to mitigate potential harms from the use of data. However, buyers serving global markets increasingly need privacy tech that offers data availability and control and supports data utility, in addition to regulatory compliance.

The report finds the COVID-19 pandemic has accelerated global marketplace adoption of privacy tech as dependence on digital technologies grows. Privacy is becoming a competitive differentiator in some sectors, and TechCrunch reports that 200+ privacy startups have together raised more than $3.5 billion over hundreds of individual rounds of funding….(More)”.

Privacy and Data Protection in Academia


Report by IAPP: “Today, demand for qualified privacy professionals is surging. Soon, societal, business and government needs for practitioners with expertise in the legal, technical and business underpinnings of data protection could far outstrip supply. To fill this gap, universities around the world are adding privacy curricula in their law, business and computer science schools. The IAPP’s Westin Research Center has catalogued these programs with the aim of promoting, catalyzing and supporting academia’s growing efforts to build an on-ramp to the privacy profession.

The information presented in our inaugural issue of “Privacy and Data Protection in Academia, A Global Guide to Curricula” represents the results of our publicly available survey. The programs included voluntarily completed the survey. The IAPP then organized the information provided and the designated contact at each institution verified the accuracy of the information presented.

This is not a comprehensive list of colleges and universities offering privacy- and data protection-related curricula. We encourage higher education institutions interested in being included to complete the survey, as the IAPP will periodically publish updates….(More)”.

Collective data rights can stop big tech from obliterating privacy


Article by Martin Tisne: “…There are two parallel approaches that should be pursued to protect the public.

One is better use of class or group actions, otherwise known as collective redress actions. Historically, these have been limited in Europe, but in November 2020 the European Parliament passed a measure requiring all 27 EU member states to implement mechanisms allowing collective redress actions across the region. Compared with the US, the EU has stronger laws protecting consumer data and promoting competition, so class or group action lawsuits in Europe can be a powerful tool for lawyers and activists to force big tech companies to change their behavior even in cases where the per-person damages would be very low.

Class action lawsuits have most often been used in the US to seek financial damages, but they can also be used to force changes in policy and practice. They can work hand in hand with campaigns to change public opinion, especially in consumer cases (for example, by forcing Big Tobacco to admit to the link between smoking and cancer, or by paving the way for car seatbelt laws). They are powerful tools when there are thousands, if not millions, of similar individual harms, which add up to help prove causation.

Part of the problem is getting the right information to sue in the first place. Government efforts, like a lawsuit brought against Facebook in December by the Federal Trade Commission (FTC) and a group of 46 states, are crucial. As the tech journalist Gilad Edelman puts it, “According to the lawsuits, the erosion of user privacy over time is a form of consumer harm—a social network that protects user data less is an inferior product—that tips Facebook from a mere monopoly to an illegal one.” In the US, as the New York Times recently reported, private lawsuits, including class actions, often “lean on evidence unearthed by the government investigations.” In the EU, however, it’s the other way around: private lawsuits can open up the possibility of regulatory action, which is constrained by the gap between EU-wide laws and national regulators.

Which brings us to the second approach: a little-known 2016 French law called the Digital Republic Bill, one of the few modern laws focused on automated decision-making. The law currently applies only to administrative decisions taken by public-sector algorithmic systems, but it provides a sketch of what future laws could look like. It says that the source code behind such systems must be made available to the public, and anyone can request that code.

Importantly, the law enables advocacy organizations to request information on the functioning of an algorithm and the source code behind it even if they don’t represent a specific individual or claimant who is allegedly harmed. The need to find a “perfect plaintiff” who can prove harm in order to file a suit makes it very difficult to tackle the systemic issues that cause collective data harms. Laure Lucchesi, the director of Etalab, a French government office in charge of overseeing the bill, says that the law’s focus on algorithmic accountability was ahead of its time. Other laws, like the European General Data Protection Regulation (GDPR), focus too heavily on individual consent and privacy. But both the data and the algorithms need to be regulated…(More)”

Did the GDPR increase trust in data collectors? Evidence from observational and experimental data


Paper by Paul C. Bauer, Frederic Gerdon, Florian Keusch, Frauke Kreuter & David Vannette: “In the wake of the digital revolution and connected technologies, societies store an ever-increasing amount of data on humans, their preferences, and behavior. These modern technologies create a trust challenge, insofar as individuals have to trust that data collectors such as private organizations, government institutions, and researchers do not misuse their data. Privacy regulations should increase trust because they provide laws that increase transparency and allow for punishment in cases in which the trustee violates trust. The introduction of the General Data Protection Regulation (GDPR) in May 2018 – a wide-reaching regulation in EU law on data protection and privacy that covers millions of individuals in Europe – provides a unique setting to study the impact of privacy regulation on trust in data collectors. We collected survey panel data in Germany around the implementation date and ran a survey experiment with a GDPR information treatment. Our observational and experimental evidence does not support the hypothesis that the GDPR has positively affected trust. This finding and our discussion of the underlying reasons are relevant for the wider research field of trust, privacy, and big data….(More)”
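The core of the experimental design is a simple randomised comparison: some respondents receive information about the GDPR, others do not, and mean trust is compared across the two groups. The sketch below illustrates that logic on simulated data; it is not the authors' code or dataset, and the zero treatment effect is built in here only to mirror the paper's headline finding.

```python
import numpy as np
from scipy import stats

# Simulated illustration of the experimental logic -- not the authors' data.
rng = np.random.default_rng(42)
n = 500
treated = rng.integers(0, 2, size=n)  # 1 = shown GDPR information
# Trust on an arbitrary scale; the treatment effect is set to zero here,
# mirroring the paper's finding of no positive effect on trust.
trust = 5.0 + 0.0 * treated + rng.normal(0.0, 1.5, size=n)

# Average treatment effect = difference in mean trust between groups.
ate = trust[treated == 1].mean() - trust[treated == 0].mean()
t_stat, p_value = stats.ttest_ind(trust[treated == 1], trust[treated == 0])
print(f"Estimated treatment effect: {ate:.3f} (t = {t_stat:.2f}, p = {p_value:.3f})")
```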

We know what you did during lockdown


An FT Film written by James Graham: “The Covid-19 pandemic has so scrambled our lives that we have barely blinked when the state has told us how many people can attend a wedding, where we can travel or even whether we should hug each other. This normalisation of the abnormal, during the moral panic of a national healthcare emergency, is the subject of People You May Know, a short film written by the playwright James Graham and commissioned by the Financial Times.

One of Britain’s most inquisitive and versatile playwrights, Graham says he has long been worried about the expansion of the “creeping data state” and has an almost “existential anxiety about privacy on all levels, emotional, philosophical, political, social”. Those concerns were first explored in his play Privacy (2014) in response to the revelations of Edward Snowden, the US security contractor turned whistleblower, who described how “the architecture of oppression” of the surveillance state had been built, if not yet fully utilised. 

In his new FT film, Graham investigates how the response to the pandemic has enabled the further intrusion of the data state and what it might mean for us all. “The power of drama is that it allows you to take a few more stepping stones into the imagined future,” he says in a Google Meet interview. …(More) (Film)”

Enabling Trusted Data Collaboration in Society


Launch of Public Beta of the Data Responsibility Journey Mapping Tool: “Data Collaboratives, the purpose-driven reuse of data in the public interest, have demonstrated their ability to unlock the societal value of siloed data and create real-world impacts. Data collaboration has been key in generating new insights and action in areas like public health, education, crisis response, and economic development, to name a few. Designing and deploying a data collaborative, however, is a complex undertaking, subject to risks of misuse of data as well as missed use of data that could have provided public value if used effectively and responsibly.

Today, The GovLab is launching the public beta of a new tool intended to help Data Stewards — responsible data leaders across sectors — and other decision-makers assess and mitigate risks across the life cycle of a data collaborative. The Data Responsibility Journey is an assessment tool for Data Stewards to identify and mitigate risks, establish trust, and maximize the value of their work. Informed by The GovLab’s long-standing research and practice in the field, and myriad consultations with data responsibility experts across regions and contexts, the tool aims to support decision-making in public agencies, civil society organizations, large businesses, small businesses, and humanitarian and development organizations, in particular.

The Data Responsibility Journey guides users through important questions and considerations across the lifecycle of data stewardship and collaboration: Planning, Collecting, Processing, Sharing, Analyzing, and Using. For each stage, users are asked to consider whether important data responsibility issues have been taken into account as part of their implementation strategy. When users flag an issue as in need of more attention, it is automatically added to a customized data responsibility strategy report providing actionable recommendations, relevant tools and resources, and key internal and external stakeholders that could be engaged to help operationalize these data responsibility actions…(More)”.
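The workflow the description outlines (stage-by-stage questions, flagged issues, and a compiled strategy report) can be sketched in a few lines of Python. The snippet below is a hypothetical model of that flow, not The GovLab's actual implementation; the real tool is a web application whose question bank and recommendations are not reproduced here.

```python
from dataclasses import dataclass, field

# Lifecycle stages named in the tool description above.
STAGES = ["Planning", "Collecting", "Processing", "Sharing", "Analyzing", "Using"]

@dataclass
class DataResponsibilityJourney:
    """Hypothetical sketch of the assessment flow, not the actual tool."""
    flagged: dict = field(default_factory=dict)  # stage -> flagged issues

    def flag(self, stage: str, issue: str) -> None:
        """Mark an issue at a given lifecycle stage as needing attention."""
        if stage not in STAGES:
            raise ValueError(f"unknown stage: {stage}")
        self.flagged.setdefault(stage, []).append(issue)

    def report(self) -> str:
        """Compile flagged issues into a simple strategy report."""
        lines = ["Data responsibility strategy report:"]
        for stage in STAGES:
            for issue in self.flagged.get(stage, []):
                lines.append(f"- [{stage}] {issue}: review recommendations and stakeholders to engage")
        return "\n".join(lines)

journey = DataResponsibilityJourney()
journey.flag("Planning", "purpose of reuse not yet defined")
journey.flag("Sharing", "no data-sharing agreement in place")
print(journey.report())
```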

The EU General Data Protection Regulation: A Commentary/Update of Selected Articles


Open Access Book edited by C. Kuner, L.A. Bygrave, C. Docksey et al.: “This book provides an update for selected articles of the GDPR Commentary published in 2020 by Oxford University Press. It covers developments between the last date of coverage of the Commentary (1 August 2019) and 1 January 2021 (with a few exceptions where later developments are taken into account). Edited by Christopher Kuner, Lee A. Bygrave, Chris Docksey, Laura Drechsler, and Luca Tosoni, it covers 49 articles of the GDPR and is being made freely accessible with the kind permission of Oxford University Press. It also includes two appendices that cover the same period as the rest of this update: the first deals with judgments of the European courts and some selected judgments of particular importance from national courts, and the second with EDPB papers…(More)”

Responsible Data Science


Book by Peter Bruce and Grant Fleming: “The increasing popularity of data science has resulted in numerous well-publicized cases of bias, injustice, and discrimination. The widespread deployment of “Black box” algorithms that are difficult or impossible to understand and explain, even for their developers, is a primary source of these unanticipated harms, making modern techniques and methods for manipulating large data sets seem sinister, even dangerous. When put in the hands of authoritarian governments, these algorithms have enabled suppression of political dissent and persecution of minorities. To prevent these harms, data scientists everywhere must come to understand how the algorithms that they build and deploy may harm certain groups or be unfair.

Responsible Data Science delivers a comprehensive, practical treatment of how to implement data science solutions in an even-handed and ethical manner that minimizes the risk of undue harm to vulnerable members of society. Both data science practitioners and managers of analytics teams will learn how to:

  • Improve model transparency, even for black box models
  • Diagnose bias and unfairness within models using multiple metrics
  • Audit projects to ensure fairness and minimize the possibility of unintended harm…(More)”
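As one concrete example of the second bullet, a common audit step is to compare a model's positive-prediction rates across demographic groups. The sketch below computes one such measure, the demographic parity difference; it is an illustrative example of the kind of metric meant, not code from the book, and a thorough audit would report several complementary metrics.

```python
import numpy as np

def demographic_parity_difference(y_pred, group):
    """Difference in positive-prediction rates between group 1 and group 0.
    A value near 0 suggests parity on this single metric; large magnitudes
    flag a disparity worth investigating alongside other fairness metrics."""
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    return y_pred[group == 1].mean() - y_pred[group == 0].mean()

# Toy example: binary predictions for eight individuals, four per group.
y_pred = [1, 0, 1, 1, 0, 0, 1, 0]
group  = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_difference(y_pred, group))  # 0.25 - 0.75 = -0.5
```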

Governing Privacy in Knowledge Commons


Open Access Book edited by Madelyn Rose Sanfilippo et al: “…explores how privacy impacts knowledge production, community formation, and collaborative governance in diverse contexts, ranging from academia and IoT to social media and mental health. Using nine new case studies and a meta-analysis of previous knowledge commons literature, the book integrates the Governing Knowledge Commons framework with Helen Nissenbaum’s Contextual Integrity framework. The multidisciplinary case studies show that personal information is often a key component of the resources created by knowledge commons. Moreover, even when it is not the focus of the commons, personal information governance may require community participation and boundaries. Taken together, the chapters illustrate the importance of exit and voice in constructing and sustaining knowledge commons through appropriate personal information flows. They also shed light on the shortcomings of current notice-and-consent style regulation of social media platforms….(More)”.