The Economics of Digital Privacy


Paper by Avi Goldfarb & Verina F. Que: “There has been increasing attention to privacy in the media and in regulatory discussions. This is a consequence of the increased usefulness of digital data. The literature has emphasized the benefits and costs of digital data flows to consumers and firms. The benefits arise in the form of data-driven innovation, higher quality products and services that match consumer needs, and increased profits. The costs relate to intrinsic and instrumental values of privacy. Under standard economic assumptions, this framing of a cost-benefit tradeoff might suggest little role for regulation beyond ensuring consumers are appropriately informed in a robust competitive environment. The empirical literature thus far has focused on this direct cost-benefit assessment, examining how privacy regulations have affected various market outcomes. However, an increasing body of theory work emphasizes externalities related to data flows. These externalities, both positive and negative, suggest benefits to the targeted regulation of digital privacy…(More)”.

UN Guide on Privacy-Enhancing Technologies for Official Statistics


UN Guide: “This document presents methodologies and approaches to mitigating privacy risks when using sensitive or confidential data, which are collectively referred to as privacy-enhancing technologies (PETs). National Statistics Offices (NSOs) are entrusted with data that has the potential to drive innovation and improve national services, research, and social benefit. Yet, there has been a rise in sustained cyber threats, complex networks of intermediaries motivated to procure sensitive data, and advances in methods to re-identify and link data to individuals and across multiple data sources. Data breaches erode public trust and can have serious negative consequences for individuals, groups, and communities. This document focuses on PETs that protect data during analysis and dissemination of sensitive information so that the benefits of using data for official statistics can be realized while minimizing privacy risks to those entrusting sensitive data to NSOs…(More)”.
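Differential privacy is one of the PETs such guides typically cover for protecting published statistics against re-identification. The sketch below is an illustrative toy (mine, not drawn from the UN guide itself): it adds Laplace noise, calibrated to a counting query's sensitivity and a privacy budget epsilon, before a count is released.

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) via inverse-CDF from a uniform draw."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, rng, sensitivity=1.0):
    """Release a count with epsilon-differential privacy.

    A counting query changes by at most `sensitivity` when one person's
    record is added or removed, so Laplace(sensitivity / epsilon) noise
    is enough for the epsilon guarantee.
    """
    return true_count + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(0)
noisy = dp_count(1_000, epsilon=0.5, rng=rng)  # smaller epsilon => more noise
```

A statistics office would publish `noisy` rather than the exact count; the smaller the epsilon, the stronger the privacy guarantee and the larger the typical distortion.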

Privacy



Book edited by Carissa Véliz and Steven M. Cahn: “Companies collect and share much of your daily life, from your location and search history, to your likes, habits, and relationships. As more and more of our personal data is collected, analyzed, and distributed, we need to think carefully about what we might be losing when we give up our privacy.

Privacy is a thought-provoking collection of philosophical essays on privacy, offering deep insights into the nature of privacy, its value, and the consequences of its loss. Bringing together both classic and contemporary work, this timely volume explores the theories, issues, debates, and applications of the philosophical study of privacy. The essays address concealment and exposure, the liberal value of privacy, privacy in social media, privacy rights and public information, privacy and the limits of law, and more…(More)”.

Americans Don’t Understand What Companies Can Do With Their Personal Data — and That’s a Problem


Press Release by the Annenberg School for Communication: “Have you ever had the experience of browsing for an item online, only to then see ads for it everywhere? Or watching a TV program, and suddenly your phone shows you an ad related to the topic? Marketers clearly know a lot about us, but the extent of what they know, how they know it, and what they’re legally allowed to know can feel awfully murky.

In a new report, “Americans Can’t Consent to Companies’ Use of Their Data,” researchers asked a nationally representative group of more than 2,000 Americans to answer a set of questions about digital marketing policies and how companies can and should use their personal data. Their aim was to determine if current “informed consent” practices are working online. 

They found that the great majority of Americans don’t understand the fundamentals of internet marketing practices and policies, and that many feel incapable of consenting to how companies use their data. As a result, the researchers say, Americans can’t truly give informed consent to digital data collection.

The survey revealed that 56% of American adults don’t understand the term “privacy policy,” often believing it means that a company won’t share their data with third parties without permission. In actual fact, many of these policies state that a company can share or sell any data it gathers about site visitors with other websites or companies.

Perhaps because so many Americans find internet privacy impossible to comprehend — with “opting out” or “opting in,” biometrics, and VPNs — they don’t trust what is being done with their digital data. Eighty percent of Americans believe that what companies know about them can cause them harm.

“People don’t feel that they have the ability to protect their data online — even if they want to,” says lead researcher Joseph Turow, Robert Lewis Shayon Professor of Media Systems & Industries at the Annenberg School for Communication at the University of Pennsylvania….(More)”

Engineering Personal Data Sharing


Report by ENISA: “This report attempts to look closer at specific use cases relating to personal data sharing, primarily in the health sector, and discusses how specific technologies and considerations of implementation can support the meeting of specific data protection principles. After discussing some challenges in (personal) data sharing, this report demonstrates how to engineer specific technologies and techniques in order to enable privacy-preserving data sharing. More specifically, it discusses specific use cases for sharing data in the health sector, with the aim of demonstrating how data protection principles can be met through the proper use of technological solutions relying on advanced cryptographic techniques. Next it discusses data sharing that takes place as part of another process or service, where the data is processed through some secondary channel or entity before reaching its primary recipient. Lastly, it identifies challenges, considerations, and possible architectural solutions on intervenability aspects (such as the right to erasure and the right to rectification when sharing data)…(More)”.
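One family of cryptographic techniques in this space is secure multi-party computation, of which additive secret sharing is the simplest building block. The sketch below is my own illustration (not taken from the ENISA report): two data holders, say two hospitals, compute a joint total without either revealing its raw count to the other or to the parties holding the shares.

```python
import secrets

PRIME = 2**61 - 1  # a large field modulus for the shares

def share(value, n):
    """Split an integer into n additive shares mod PRIME.

    Any n-1 shares look uniformly random; only all n together
    reveal the value.
    """
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares into the original value."""
    return sum(shares) % PRIME

# Each hospital secret-shares its patient count among three servers.
a = share(120, 3)
b = share(80, 3)
# Each server adds the shares it holds, locally.
joint = [(x + y) % PRIME for x, y in zip(a, b)]
assert reconstruct(joint) == 200  # joint total, raw counts never pooled
```

Because addition distributes over the shares, the servers can aggregate without ever seeing either hospital's input, which is the property that makes such schemes attractive for cross-organisation statistics.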

From privacy to partnership


Press Release: “The NHS and other public sector institutions should lead the way in piloting Privacy Enhancing Technologies (PETs) that could help unlock ‘lifesaving’ data without compromising privacy, a report by the Royal Society has said.

From privacy to partnership, the report from the UK’s national academy of sciences, highlights cases where better use of data could have significant public benefits – from cancer research to reaching net-zero carbon emissions.

PETs encompass a suite of tools, such as a new generation of encryption and synthetic data, that could help deliver those benefits by reducing risks inherent to data use. However, their adoption to date has been limited.
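Synthetic data, one of the tools the report names, replaces real records with artificial ones that preserve aggregate patterns. A deliberately minimal sketch of the idea (mine, fitting a single Gaussian; real generators model full joint distributions and add formal privacy guarantees):

```python
import random
import statistics

def synthesize(real_values, n, seed=0):
    """Draw n synthetic values from a Gaussian fitted to the real data.

    Analysts work with the synthetic draws; the underlying records
    are never released.
    """
    mu = statistics.mean(real_values)
    sigma = statistics.stdev(real_values)
    rng = random.Random(seed)
    return [rng.gauss(mu, sigma) for _ in range(n)]

ages = [34, 29, 41, 55, 38, 47, 30, 62]  # hypothetical patient ages
synthetic_ages = synthesize(ages, n=1000)
```

The synthetic sample reproduces the mean and spread of the original, so summary analyses carry over, while no synthetic record corresponds to a real individual.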

The report, which profiles public sector readiness for PETs, calls for public bodies to champion these technologies in partnership with small- and medium-sized enterprises, and for the UK government to establish a ‘national strategy for the responsible use of PETs’.

This should support data use for public good through establishment of common standards for PETs, as well as bursaries and prizes to incentivise and accelerate development of a marketplace for their application.


This builds on the Royal Society’s 2019 report Protecting privacy in practice. Following rapid developments in the field, the new report aims to establish principles and standards for the responsible use of PETs. This includes ensuring PETs are not limited to private sector organisations but are also used in cross-sector data partnerships for collaborative analysis to achieve wider public benefit.

Healthcare is a key use case identified by the report. Medical technology advances, coupled with comprehensive electronic patient records in the NHS and a strong academic research base, mean “the UK is well positioned to deliver timely and impactful health research and its translation to offer more effective treatments, track and prevent public health risks, utilising health data to improve and save lives,” the report said…(More)”.

The Signal App and the Danger of Privacy at All Costs


Article by Reid Blackman: “…One should always worry when a person or an organization places one value above all. The moral fabric of our world is complex. It’s nuanced. Sensitivity to moral nuance is difficult, but unwavering support of one principle to rule them all is morally dangerous.

The way Signal wields the word “surveillance” reflects its coarse-grained understanding of morality. To the company, surveillance covers everything from a server holding encrypted data that no one looks at to a law enforcement agent reading data after obtaining a warrant to East Germany randomly tapping citizens’ phones. One cannot think carefully about the value of privacy — including its relative importance to other values in particular contexts — with such a broad brush.

What’s more, the company’s proposition that if anyone has access to data, then many unauthorized people probably will have access to that data is false. This response reflects a lack of faith in good governance, which is essential to any well-functioning organization or community seeking to keep its members and society at large safe from bad actors. There are some people who have access to the nuclear launch codes, but “Mission Impossible” movies aside, we’re not particularly worried about a slippery slope leading to lots of unauthorized people having access to those codes.

I am drawing attention to Signal, but there’s a bigger issue here: Small groups of technologists are developing and deploying applications of their technologies for explicitly ideological reasons, with those ideologies baked into the technologies. To use those technologies is to use a tool that comes with an ethical or political bent.

Signal is pushing against businesses like Meta that turn users of their social media platforms into the product by selling user data. But Signal embeds within itself a rather extreme conception of privacy, and scaling its technology is scaling its ideology. Signal’s users may not be the product, but they are the witting or unwitting advocates of the moral views of the 40 or so people who operate Signal.

There’s something somewhat sneaky in all this (though I don’t think the owners of Signal intend to be sneaky). Usually advocates know that they’re advocates. They engage in some level of deliberation and reach the conclusion that a set of beliefs is for them…(More)”.

Economic Research on Privacy Regulation: Lessons from the GDPR and Beyond


Paper by Garrett Johnson: “This paper reviews the economic literature on the European Union’s General Data Protection Regulation (GDPR). I highlight key challenges for studying the regulation including the difficulty of finding a suitable control group, variable firm compliance and regulatory enforcement, as well as the regulation’s impact on data observability. The economic literature on the GDPR to date has largely—though not universally—documented harms to firms. These include harms to firm performance, innovation, competition, the web, and marketing. On the elusive consumer welfare side, the literature documents some objective privacy improvements as well as helpful survey evidence. The literature also examines the consequences of the GDPR’s design decisions and illuminates how the GDPR works in practice. Finally, I suggest opportunities for future research on the GDPR as well as privacy regulation and privacy-related innovation more broadly…(More)”.

Orbán used Hungarians’ COVID data to boost election campaign, report says


Article by Louis Westendarp: “Hungarian Prime Minister Viktor Orbán’s ruling party Fidesz used citizens’ data from COVID-19 vaccine signups to spread Fidesz campaign messages before Hungary’s election in April 2022, according to a report by Human Rights Watch.

Not only was data from vaccine jabs used to help Fidesz, but also data from tax benefit applications and association membership registrations. This violates privacy rights, said the report — and blurs the line between the ruling party and government resources in Hungary, which has repeatedly been warned by the EU to clean up its act regarding the rule of law.

“Using people’s personal data collected so they could access public services to bombard them with political campaign messages is a betrayal of trust and an abuse of power,” said Deborah Brown, senior technology researcher at Human Rights Watch…(More)”.

Our Data, Ourselves


Book by Jacqueline D. Lipton: “Our Data, Ourselves addresses a common and crucial question: What can we as private individuals do to protect our personal information in a digital world? In this practical handbook, legal expert Jacqueline D. Lipton guides readers through important issues involving technology, data collection, and digital privacy as they apply to our daily lives.

Our Data, Ourselves covers a broad range of everyday privacy concerns with easily digestible, accessible overviews and real-world examples. Lipton explores the ways we can protect our personal data and monitor its use by corporations, the government, and others. She also explains our rights regarding sensitive personal data like health insurance records and credit scores, as well as what information retailers can legally gather, and how. Who actually owns our personal information? Can an employer legally access personal emails? What privacy rights do we have on social media? Answering these questions and more, Our Data, Ourselves provides a strategic approach to assuming control over, and ultimately protecting, our personal information…(More)”