Is there a role for consent in privacy?


Article by Robert Gellman: “After decades, we still talk about the role of notice and choice in privacy. Yet there seems to be broad recognition that notice and choice do nothing for the privacy of consumers. Some American businesses cling to notice and choice because they hate all the alternatives. Some legislators draft laws with elements of notice and choice, because it’s easier to draft a law that way, because they don’t know any better, or because they carry water for business.

For present purposes, I will talk about notice and choice generically as consent. Consent is a broader concept than choice, but the difference doesn’t matter for the point I want to make. How you frame consent is complex. There are many alternatives and many approaches. It’s not just a matter of opt-in or opt-out. While I’m discarding issues, I also want to acknowledge and set aside the eight basic Fair Information Practices (FIPs). There is no notice and choice principle in FIPs, and FIPs are not specifically important here.

Until recently, my view was that consent in almost any form is pretty much death for consumer privacy. No matter how you structure it, websites and others will find a way to wheedle consent from consumers. Those who want to exploit consumer data will cajole, pressure, threaten, mystify, obscure, entice or otherwise coax consumers to agree.

Suddenly, I’m not as sure of my conclusion about consent. What changed my mind? There is a new data point from Apple’s App Tracking Transparency framework. Apple requires mobile application developers to obtain opt-in consent before serving targeted advertising via Apple’s Identifier for Advertisers. Early reports suggest consumers are saying “NO” in overwhelming numbers — overwhelming as in more than 90%.

It isn’t this strong consumer reaction that makes me think consent might possibly have a place. I want to highlight a different aspect of the Apple framework….(More)”.
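For context, the consent prompt Gellman describes is triggered by a single API call. Below is a minimal sketch (not from the article) of how an iOS app requests permission under Apple’s AppTrackingTransparency framework; only an explicit grant makes the real Identifier for Advertisers available:

```swift
import AppTrackingTransparency
import AdSupport

/// Request tracking consent before using the IDFA (iOS 14+).
/// The app must also declare NSUserTrackingUsageDescription in Info.plist.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // Consent granted: the real IDFA is available for ad targeting.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // No consent: the IDFA is returned as all zeroes,
            // so it cannot be used for cross-app tracking.
            print("Tracking not authorized: \(status)")
        @unknown default:
            break
        }
    }
}
```

Because the zeroed identifier is useless for targeting, developers cannot quietly fall back to tracking once the user declines.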

ASEAN Data Management Framework


ASEAN Framework: “Due to the growing interactions between data, connected things and people, trust in data has become the pre-condition for fully realising the gains of digital transformation. SMEs are treading a fine line between pursuing digital initiatives and concurrently managing data protection and customer privacy safeguards to ensure that these do not impede innovation. Therefore, there is a motivation to focus on digital data governance, as it is critical to boost economic integration and technology adoption across all sectors in the ten ASEAN Member States (AMS).
To ensure that their data is appropriately managed and protected, organisations need to know what levels of technical, procedural and physical controls they need to put in place. The categorisation of datasets helps organisations manage their data assets and put in place the right level of controls. This is applicable for both data at rest and data in transit. The establishment of an ASEAN Data Management Framework will promote sound data governance practices by helping organisations to discover the datasets they have, assign them the appropriate categories, manage and protect the data accordingly, all while continuing to comply with relevant regulations. Improved governance and protection will instil trust in data sharing both between organisations and between countries, which will then promote the growth of trade and the flow of data among AMS and their partners in the digital economy….(More)”
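The framework is a governance document rather than an API, but the categorise-then-control idea it describes can be sketched in code. The categories and control set below are hypothetical placeholders for illustration, not the framework’s actual taxonomy:

```swift
/// Hypothetical dataset sensitivity categories; the framework's
/// actual taxonomy may differ.
enum DataCategory {
    case open, internalUse, confidential, restricted
}

/// A baseline set of technical controls for a category, covering
/// data at rest and data in transit.
struct Controls {
    let encryptAtRest: Bool
    let encryptInTransit: Bool
    let accessLogging: Bool
    let physicalIsolation: Bool
}

/// Map each category to the controls it requires (illustrative values).
func baselineControls(for category: DataCategory) -> Controls {
    switch category {
    case .open:
        return Controls(encryptAtRest: false, encryptInTransit: true,
                        accessLogging: false, physicalIsolation: false)
    case .internalUse:
        return Controls(encryptAtRest: true, encryptInTransit: true,
                        accessLogging: false, physicalIsolation: false)
    case .confidential:
        return Controls(encryptAtRest: true, encryptInTransit: true,
                        accessLogging: true, physicalIsolation: false)
    case .restricted:
        return Controls(encryptAtRest: true, encryptInTransit: true,
                        accessLogging: true, physicalIsolation: true)
    }
}
```

The point of the mapping is that controls follow from the category assigned during data discovery, rather than being chosen ad hoc per dataset.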

Unity in Privacy Diversity: A Kaleidoscopic View of Privacy Definitions


Paper by Bert-Jaap Koops and Maša Galič: “Contrary to the common claim that privacy is a concept in disarray, this Article argues that there is considerable coherence in the way privacy has been conceptualized over many decades of privacy scholarship. Seemingly disparate approaches and widely differing definitions actually share close family resemblances that, viewed together from a bird’s-eye perspective, suggest that the concept of privacy is more akin to a kaleidoscope than to a swamp. As a heuristic device to look at this kaleidoscope, we use a working definition of privacy as having spaces in which you can be yourselves, building on two major strands in privacy theory: identity-building and boundary-management. With this heuristic, we analyze how six authoritative privacy accounts can be understood in the terms and rationale of other definitions. We show how the notions of Cohen (room for boundary management), Johnson (freedom from others’ judgement), Nissenbaum (contextual integrity), Reiman (personhood), Warren and Brandeis (being let alone), and Westin (control over information) have significant overlap with—or may even be equivalent to—an understanding of privacy in terms of identity-fostering spaces. Our kaleidoscopic perspective highlights not only that there is coherence in privacy, but also helps to understand the function and value of having many different privacy definitions around: each time and context bring their own privacy-related challenges, which might best be addressed through a certain conceptualization of privacy that works in that particular context. As the world turns its kaleidoscope of emerging privacy issues, privacy scholarship turns its kaleidoscope of privacy definitions along. The result of this kaleidoscopic perspective on privacy is an illuminating picture of unity in diversity….(More)”.

Governance mechanisms for sharing of health data: An approach towards selecting attributes for complex discrete choice experiment studies


Paper by Jennifer Viberg Johansson: “The Discrete Choice Experiment (DCE) is a well-established technique for eliciting individual preferences, but it has rarely been used to elicit preferences for the governance of health data sharing.

The aim of this article was to describe the process of identifying attributes for a DCE study aiming to elicit preferences of citizens in Sweden, Iceland and the UK for governance mechanisms for digitally sharing different kinds of health data in different contexts.

A three-step approach was utilised to inform the attribute and level selection: 1) Attribute identification, 2) Attribute development and 3) Attribute refinement. First, we developed an initial set of potential attributes from a literature review and a workshop with experts. To further develop attributes, focus group discussions with citizens (n = 13), ranking exercises among focus group participants (n = 48) and expert interviews (n = 18) were performed. Thereafter, attributes were refined using group discussion (n = 3) with experts as well as cognitive interviews with citizens (n = 11).

The results led to the selection of seven attributes for further development: 1) level of identification, 2) the purpose of data use, 3) type of information, 4) consent, 5) new data user, 6) collector and 7) the oversight of data sharing. Differences were found between countries regarding the order of the top three attributes. The process revealed participants’ conceptualisation of the chosen attributes and informed our attribute development phase.
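To make the end product concrete, these seven attributes become the rows of a choice task in the eventual DCE. The sketch below uses hypothetical levels (the excerpt does not list them) and random draws in place of the efficient experimental designs real DCE studies use:

```swift
/// The seven attributes selected in the study; the example levels are
/// hypothetical, since the excerpt does not list them.
struct Attribute {
    let name: String
    let levels: [String]
}

let attributes = [
    Attribute(name: "Level of identification", levels: ["Anonymous", "De-identified", "Identifiable"]),
    Attribute(name: "Purpose of data use", levels: ["Public research", "Commercial research"]),
    Attribute(name: "Type of information", levels: ["Lifestyle data", "Medical records", "Genetic data"]),
    Attribute(name: "Consent", levels: ["Broad consent", "Specific consent"]),
    Attribute(name: "New data user", levels: ["University", "Private company"]),
    Attribute(name: "Collector", levels: ["Healthcare provider", "App provider"]),
    Attribute(name: "Oversight of data sharing", levels: ["Ethics review board", "No review"]),
]

/// Draw one hypothetical data-sharing scenario by picking a level per attribute.
func randomAlternative() -> [String: String] {
    var alternative: [String: String] = [:]
    for attribute in attributes {
        alternative[attribute.name] = attribute.levels.randomElement()!
    }
    return alternative
}

// A choice task pairs two scenarios; respondents state which they prefer.
let choiceTask = (a: randomAlternative(), b: randomAlternative())
```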

This study demonstrates a process for the selection of attributes for a (multi-country) DCE involving three stages: Attribute identification, Attribute development and Attribute refinement. This study can contribute to improving the ethical aspects and good practice of this phase in DCE studies. Specifically, it can contribute to the development of governance mechanisms in the digital world, where people’s health data are shared for multiple purposes….(More)”.

Privacy Tech’s Third Generation


“A Review of the Emerging Privacy Tech Sector” by Privacy Tech Alliance and Future of Privacy Forum: “As we enter the third phase of development of the privacy tech market, purchasers are demanding more integrated solutions, product offerings are more comprehensive, and startup valuations are higher than ever, according to a new report from the Future of Privacy Forum and Privacy Tech Alliance. These factors are leading companies to provide a wider range of services, act as risk management platforms, and focus on supporting business outcomes.

According to the report, “Privacy Tech’s Third Generation: A Review of the Emerging Privacy Tech Sector,” regulations are often the biggest driver for buyers’ initial privacy tech purchases. Organizations also are deploying tools to mitigate potential harms from the use of data. However, buyers serving global markets increasingly need privacy tech that offers data availability and control and supports data utility, in addition to regulatory compliance.

The report finds the COVID-19 pandemic has accelerated global marketplace adoption of privacy tech as dependence on digital technologies grows. Privacy is becoming a competitive differentiator in some sectors, and TechCrunch reports that 200+ privacy startups have together raised more than $3.5 billion over hundreds of individual rounds of funding….(More)”.

Privacy and Data Protection in Academia


Report by IAPP: “Today, demand for qualified privacy professionals is surging. Soon, societal, business and government needs for practitioners with expertise in the legal, technical and business underpinnings of data protection could far outstrip supply. To fill this gap, universities around the world are adding privacy curricula in their law, business and computer science schools. The IAPP’s Westin Research Center has catalogued these programs with the aim of promoting, catalyzing and supporting academia’s growing efforts to build an on-ramp to the privacy profession.

The information presented in our inaugural issue of “Privacy and Data Protection in Academia, A Global Guide to Curricula” represents the results of our publicly available survey. The programs included voluntarily completed the survey. The IAPP then organized the information provided, and the designated contact at each institution verified the accuracy of the information presented.

This is not a comprehensive list of colleges and universities offering privacy and data protection related curricula. We encourage higher education institutions interested in being included to complete the survey, as the IAPP will periodically publish updates….(More)”.

Collective data rights can stop big tech from obliterating privacy


Article by Martin Tisne: “…There are two parallel approaches that should be pursued to protect the public.

One is better use of class or group actions, otherwise known as collective redress actions. Historically, these have been limited in Europe, but in November 2020 the European Parliament passed a measure that requires all 27 EU member states to implement rules allowing for collective redress actions across the region. Compared with the US, the EU has stronger laws protecting consumer data and promoting competition, so class or group action lawsuits in Europe can be a powerful tool for lawyers and activists to force big tech companies to change their behavior even in cases where the per-person damages would be very low.

Class action lawsuits have most often been used in the US to seek financial damages, but they can also be used to force changes in policy and practice. They can work hand in hand with campaigns to change public opinion, especially in consumer cases (for example, by forcing Big Tobacco to admit to the link between smoking and cancer, or by paving the way for car seatbelt laws). They are powerful tools when there are thousands, if not millions, of similar individual harms, which add up to help prove causation.

Part of the problem is getting the right information to sue in the first place. Government efforts, like a lawsuit brought against Facebook in December by the Federal Trade Commission (FTC) and a group of 46 states, are crucial. As the tech journalist Gilad Edelman puts it, “According to the lawsuits, the erosion of user privacy over time is a form of consumer harm—a social network that protects user data less is an inferior product—that tips Facebook from a mere monopoly to an illegal one.” In the US, as the New York Times recently reported, private lawsuits, including class actions, often “lean on evidence unearthed by the government investigations.” In the EU, however, it’s the other way around: private lawsuits can open up the possibility of regulatory action, which is constrained by the gap between EU-wide laws and national regulators.

Which brings us to the second approach: a little-known 2016 French law called the Digital Republic Bill. The Digital Republic Bill is one of the few modern laws focused on automated decision making. The law currently applies only to administrative decisions taken by public-sector algorithmic systems. But it provides a sketch for what future laws could look like. It says that the source code behind such systems must be made available to the public. Anyone can request that code.

Importantly, the law enables advocacy organizations to request information on the functioning of an algorithm and the source code behind it even if they don’t represent a specific individual or claimant who is allegedly harmed. The need to find a “perfect plaintiff” who can prove harm in order to file a suit makes it very difficult to tackle the systemic issues that cause collective data harms. Laure Lucchesi, the director of Etalab, a French government office in charge of overseeing the bill, says that the law’s focus on algorithmic accountability was ahead of its time. Other laws, like the European General Data Protection Regulation (GDPR), focus too heavily on individual consent and privacy. But both the data and the algorithms need to be regulated…(More)”

Did the GDPR increase trust in data collectors? Evidence from observational and experimental data


Paper by Paul C. Bauer, Frederic Gerdon, Florian Keusch, Frauke Kreuter & David Vannette: “In the wake of the digital revolution and connected technologies, societies store an ever-increasing amount of data on humans, their preferences, and behavior. These modern technologies create a trust challenge, insofar as individuals have to trust data collectors such as private organizations, government institutions, and researchers that their data is not misused. Privacy regulations should increase trust because they provide laws that increase transparency and allow for punishment in cases in which the trustee violates trust. The introduction of the General Data Protection Regulation (GDPR) in May 2018 – a wide-reaching regulation in EU law on data protection and privacy that covers millions of individuals in Europe – provides a unique setting to study the impact of privacy regulation on trust in data collectors. We collected survey panel data in Germany around the implementation date and ran a survey experiment with a GDPR information treatment. Our observational and experimental evidence does not support the hypothesis that the GDPR has positively affected trust. This finding and our discussion of the underlying reasons are relevant for the wider research field of trust, privacy, and big data….(More)”
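As an illustration of the experimental logic only (a sketch with made-up structure, not the authors’ analysis), the effect of the GDPR information treatment on trust reduces to a difference in means between randomly assigned groups; the paper’s null result corresponds to an estimate near zero:

```swift
/// A survey respondent: trust measured on a numeric scale (hypothetical data).
struct Respondent {
    let receivedGDPRInfo: Bool      // randomly assigned information treatment
    let trustInDataCollectors: Double
}

/// Difference-in-means estimate of the average treatment effect.
/// Assumes both groups are non-empty.
func averageTreatmentEffect(_ sample: [Respondent]) -> Double {
    let treated = sample.filter { $0.receivedGDPRInfo }
                        .map { $0.trustInDataCollectors }
    let control = sample.filter { !$0.receivedGDPRInfo }
                        .map { $0.trustInDataCollectors }
    let meanTreated = treated.reduce(0, +) / Double(treated.count)
    let meanControl = control.reduce(0, +) / Double(control.count)
    return meanTreated - meanControl
}
```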

We know what you did during lockdown


An FT Film written by James Graham: “The Covid-19 pandemic has so scrambled our lives that we have barely blinked when the state has told us how many people can attend a wedding, where we can travel or even whether we should hug each other. This normalisation of the abnormal, during the moral panic of a national healthcare emergency, is the subject of People You May Know, a short film written by the playwright James Graham and commissioned by the Financial Times.

One of Britain’s most inquisitive and versatile playwrights, Graham says he has long been worried about the expansion of the “creeping data state” and has an almost “existential anxiety about privacy on all levels, emotional, philosophical, political, social”. Those concerns were first explored in his play Privacy (2014) in response to the revelations of Edward Snowden, the US security contractor turned whistleblower, who described how “the architecture of oppression” of the surveillance state had been built, if not yet fully utilised. 

In his new FT film, Graham investigates how the response to the pandemic has enabled the further intrusion of the data state and what it might mean for us all. “The power of drama is that it allows you to take a few more stepping stones into the imagined future,” he says in a Google Meet interview. …(More) (Film)”

Enabling Trusted Data Collaboration in Society


Launch of Public Beta of the Data Responsibility Journey Mapping Tool: “Data Collaboratives, the purpose-driven reuse of data in the public interest, have demonstrated their ability to unlock the societal value of siloed data and create real-world impacts. Data collaboration has been key in generating new insights and action in areas like public health, education, crisis response, and economic development, to name a few. Designing and deploying a data collaborative, however, is a complex undertaking, subject to risks of misuse of data as well as missed use of data that could have provided public value if used effectively and responsibly.

Today, The GovLab is launching the public beta of a new tool intended to help Data Stewards — responsible data leaders across sectors — and other decision-makers assess and mitigate risks across the life cycle of a data collaborative. The Data Responsibility Journey is an assessment tool for Data Stewards to identify and mitigate risks, establish trust, and maximize the value of their work. Informed by The GovLab’s longstanding research and practice in the field, and myriad consultations with data responsibility experts across regions and contexts, the tool aims to support decision-making in public agencies, civil society organizations, large businesses, small businesses, and humanitarian and development organizations, in particular.

The Data Responsibility Journey guides users through important questions and considerations across the lifecycle of data stewardship and collaboration: Planning, Collecting, Processing, Sharing, Analyzing, and Using. For each stage, users are asked to consider whether important data responsibility issues have been taken into account as part of their implementation strategy. When users flag an issue as in need of more attention, it is automatically added to a customized data responsibility strategy report providing actionable recommendations, relevant tools and resources, and key internal and external stakeholders that could be engaged to help operationalize these data responsibility actions…(More)”.
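A minimal sketch of that flow, with hypothetical issue and recommendation text (the tool’s real content and interface are richer): flagged issues accumulate per lifecycle stage and are grouped into a strategy report:

```swift
/// The six lifecycle stages the tool walks through.
enum LifecycleStage: String, CaseIterable {
    case planning = "Planning", collecting = "Collecting",
         processing = "Processing", sharing = "Sharing",
         analyzing = "Analyzing", using = "Using"
}

/// An issue the user flagged as needing more attention
/// (question and recommendation text are illustrative placeholders).
struct FlaggedIssue {
    let stage: LifecycleStage
    let question: String
    let recommendation: String
}

/// Group flagged issues by stage into a simple strategy report,
/// mirroring the tool's customized-report behaviour.
func strategyReport(from issues: [FlaggedIssue]) -> String {
    var report = "Data Responsibility Strategy Report\n"
    for stage in LifecycleStage.allCases {
        let stageIssues = issues.filter { $0.stage == stage }
        guard !stageIssues.isEmpty else { continue }
        report += "\n\(stage.rawValue):\n"
        for issue in stageIssues {
            report += "- \(issue.question)\n  Recommendation: \(issue.recommendation)\n"
        }
    }
    return report
}
```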