Americans Don’t Understand What Companies Can Do With Their Personal Data — and That’s a Problem


Press Release by the Annenberg School for Communication: “Have you ever had the experience of browsing for an item online, only to then see ads for it everywhere? Or watching a TV program, and suddenly your phone shows you an ad related to the topic? Marketers clearly know a lot about us, but the extent of what they know, how they know it, and what they’re legally allowed to know can feel awfully murky.

In a new report, “Americans Can’t Consent to Companies’ Use of Their Data,” researchers asked a nationally representative group of more than 2,000 Americans to answer a set of questions about digital marketing policies and how companies can and should use their personal data. Their aim was to determine if current “informed consent” practices are working online. 

They found that the great majority of Americans don’t understand the fundamentals of internet marketing practices and policies, and that many feel incapable of consenting to how companies use their data. As a result, the researchers say, Americans can’t truly give informed consent to digital data collection.

The survey revealed that 56% of American adults don’t understand the term “privacy policy,” often believing it means that a company won’t share their data with third parties without permission. In fact, many of these policies state that a company can share or sell any data it gathers about site visitors to other websites or companies.

Perhaps because so many Americans find internet privacy impossible to comprehend — with “opting out” or “opting in,” biometrics, and VPNs — they don’t trust what is being done with their digital data. Eighty percent of Americans believe that what companies know about them can cause them harm.

“People don’t feel that they have the ability to protect their data online — even if they want to,” says lead researcher Joseph Turow, Robert Lewis Shayon Professor of Media Systems & Industries at the Annenberg School for Communication at the University of Pennsylvania….(More)”

Engineering Personal Data Sharing


Report by ENISA: “This report attempts to look closer at specific use cases relating to personal data sharing, primarily in the health sector, and discusses how specific technologies and considerations of implementation can support the meeting of specific data protection principles. After discussing some challenges in (personal) data sharing, this report demonstrates how to engineer specific technologies and techniques in order to enable privacy-preserving data sharing. More specifically, it discusses specific use cases for sharing data in the health sector, with the aim of demonstrating how data protection principles can be met through the proper use of technological solutions relying on advanced cryptographic techniques. Next, it discusses data sharing that takes place as part of another process or service, where the data is processed through some secondary channel or entity before reaching its primary recipient. Lastly, it identifies challenges, considerations and possible architectural solutions on intervenability aspects (such as the right to erasure and the right to rectification when sharing data)…(More)”.

From privacy to partnership


Press Release: “The NHS and other public sector institutions should lead the way in piloting Privacy Enhancing Technologies (PETs) that could help unlock ‘lifesaving’ data without compromising privacy, a report by the Royal Society has said.

From privacy to partnership, the report from the UK’s national academy of science, highlights cases where better use of data could have significant public benefits – from cancer research to reaching net-zero carbon emissions.

PETs encompass a suite of tools, such as a new generation of encryption and synthetic data, that could help deliver those benefits by reducing risks inherent to data use. However, their adoption to date has been limited.

The report, which profiles public sector readiness for PETs, calls for public bodies to champion these technologies in partnership with small and medium-sized enterprises, and for the UK government to establish a ‘national strategy for the responsible use of PETs’.

This should support data use for public good through establishment of common standards for PETs, as well as bursaries and prizes to incentivise and accelerate development of a marketplace for their application.

The new report builds on the Royal Society’s 2019 report Protecting privacy in practice. Following rapid developments in the field, it aims to establish principles and standards for the responsible use of PETs. This includes ensuring PETs are not limited to private sector organisations but are also used in cross-sector data partnerships for collaborative analysis to achieve wider public benefit.

 Healthcare is a key use case identified by the report. Medical technology advances, coupled with comprehensive electronic patient records in the NHS and a strong academic research base, mean “the UK is well positioned to deliver timely and impactful health research and its translation to offer more effective treatments, track and prevent public health risks, utilising health data to improve and save lives,” the report said…(More)”.

The Signal App and the Danger of Privacy at All Costs


Article by Reid Blackman: “…One should always worry when a person or an organization places one value above all. The moral fabric of our world is complex. It’s nuanced. Sensitivity to moral nuance is difficult, but unwavering support of one principle to rule them all is morally dangerous.

The way Signal wields the word “surveillance” reflects its coarse-grained understanding of morality. To the company, surveillance covers everything from a server holding encrypted data that no one looks at to a law enforcement agent reading data after obtaining a warrant to East Germany randomly tapping citizens’ phones. One cannot think carefully about the value of privacy — including its relative importance to other values in particular contexts — with such a broad brush.

What’s more, the company’s proposition that if anyone has access to data, then many unauthorized people probably will have access to that data is false. This response reflects a lack of faith in good governance, which is essential to any well-functioning organization or community seeking to keep its members and society at large safe from bad actors. There are some people who have access to the nuclear launch codes, but “Mission Impossible” movies aside, we’re not particularly worried about a slippery slope leading to lots of unauthorized people having access to those codes.

I am drawing attention to Signal, but there’s a bigger issue here: Small groups of technologists are developing and deploying applications of their technologies for explicitly ideological reasons, with those ideologies baked into the technologies. To use those technologies is to use a tool that comes with an ethical or political bent.

Signal is pushing against businesses like Meta that turn users of their social media platforms into the product by selling user data. But Signal embeds within itself a rather extreme conception of privacy, and scaling its technology is scaling its ideology. Signal’s users may not be the product, but they are the witting or unwitting advocates of the moral views of the 40 or so people who operate Signal.

There’s something somewhat sneaky in all this (though I don’t think the owners of Signal intend to be sneaky). Usually advocates know that they’re advocates. They engage in some level of deliberation and reach the conclusion that a set of beliefs is for them…(More)”.

Economic Research on Privacy Regulation: Lessons from the GDPR and Beyond


Paper by Garrett Johnson: “This paper reviews the economic literature on the European Union’s General Data Protection Regulation (GDPR). I highlight key challenges for studying the regulation, including the difficulty of finding a suitable control group, variable firm compliance and regulatory enforcement, as well as the regulation’s impact on data observability. The economic literature on the GDPR to date has largely—though not universally—documented harms to firms. These include harms to firm performance, innovation, competition, the web, and marketing. On the elusive consumer welfare side, the literature documents some objective privacy improvements as well as helpful survey evidence. The literature also examines the consequences of the GDPR’s design decisions and illuminates how the GDPR works in practice. Finally, I suggest opportunities for future research on the GDPR as well as privacy regulation and privacy-related innovation more broadly…(More)”.

Orbán used Hungarians’ COVID data to boost election campaign, report says


Article by Louis Westendarp: “Hungarian Prime Minister Viktor Orbán’s ruling party Fidesz used citizens’ data from COVID-19 vaccine signups to spread Fidesz campaign messages before Hungary’s election in April 2022, according to a report by Human Rights Watch.

Not only was data from vaccine jabs used to help Fidesz, but also data from tax benefits applications and association membership registrations. This violates privacy rights, said the report — and blurs the line between the ruling party and government resources in Hungary, which has repeatedly been warned by the EU to clean up its act regarding the rule of law.

“Using people’s personal data collected so they could access public services to bombard them with political campaign messages is a betrayal of trust and an abuse of power,” said Deborah Brown, senior technology researcher at Human Rights Watch…(More)”.

Our Data, Ourselves


Book by Jacqueline D. Lipton: “Our Data, Ourselves addresses a common and crucial question: What can we as private individuals do to protect our personal information in a digital world? In this practical handbook, legal expert Jacqueline D. Lipton guides readers through important issues involving technology, data collection, and digital privacy as they apply to our daily lives.

Our Data, Ourselves covers a broad range of everyday privacy concerns with easily digestible, accessible overviews and real-world examples. Lipton explores the ways we can protect our personal data and monitor its use by corporations, the government, and others. She also explains our rights regarding sensitive personal data like health insurance records and credit scores, as well as what information retailers can legally gather, and how. Who actually owns our personal information? Can an employer legally access personal emails? What privacy rights do we have on social media? Answering these questions and more, Our Data, Ourselves provides a strategic approach to assuming control over, and ultimately protecting, our personal information…(More)”

Ethical Considerations in Re-Using Private Sector Data for Migration-Related Policy


IOM practitioner’s paper: “This paper assesses the ethical risks of using non-traditional data sources to inform migration-related policymaking and suggests practical safeguards for various stages of the data cycle. The past decade has witnessed the rapid growth of non-traditional data (social media, mobile phones, satellite data, bank records, etc.) and their use in migration research and policy. While these data sources may be tempting and can shed light on major migration trends, ensuring the ethical and responsible use of big data at every stage of migration research and policymaking is complex.

The recognition of the potential of new data sources for migration policy has grown exponentially in recent years. Data innovation is one of the cross-cutting priorities of IOM’s Migration Data Strategy. Further, the UN General Assembly recognises rapid technological developments and their potential in achieving the Sustainable Development Goals, and the Global Compact for Safe, Orderly and Regular Migration highlights the importance of harnessing data innovation to improve data and evidence for informed policies on migration. However, with big data come big risks. New technological developments have opened new challenges, particularly concerning data protection, individual privacy, human security, and fundamental rights. These risks can be greater for certain migrant and displaced groups. The identified risks are:…(More)” (see also Big Data for Migration Alliance)

Vulnerable People and Data Protection Law


Book by Gianclaudio Malgieri: “Human vulnerability has traditionally been viewed through the lens of specific groups of people, such as ethnic minorities, children, the elderly, or people with disabilities. With the rise of digital media, our perceptions of vulnerable groups and individuals have been reshaped as new vulnerabilities and different vulnerable sub-groups of users, consumers, citizens, and data subjects emerge.

Vulnerable People and Data Protection Law not only depicts these problems but offers the reader a detailed investigation of the concept of data subjects and a reconceptualisation of the notion of vulnerability within the General Data Protection Regulation. The regulation offers a forward-facing set of tools that – though largely underexplored – are essential in rebalancing power asymmetries and mitigating induced vulnerabilities in the age of artificial intelligence.

This book proposes a layered approach to data subject definition. Considering the new potentialities of the digital market, the new awareness about cognitive weaknesses, and the new philosophical sensitivity about vulnerability conditions, the author looks for a more general definition of vulnerability that goes beyond traditional labels. In doing so, he seeks to promote a ‘vulnerability-aware’ interpretation of the GDPR.

A heuristic analysis that re-interprets the whole GDPR, this work is a must-read both for scholars of data protection law and for policymakers looking to strengthen regulations and protect the data of vulnerable individuals…(More)”.

Digitization, Surveillance, Colonialism


Essay by Carissa Véliz: “As I write these words, articles are mushrooming in newspapers and magazines about how privacy is more important than ever after the Supreme Court ruling that overturned the constitutional right to abortion in the United States. In anti-abortion states, browsing histories, text messages, location data, payment data, and information from period-tracking apps can all be used to prosecute both women seeking an abortion and anyone aiding them. The National Right to Life Committee recently published policy recommendations for anti-abortion states that include criminal penalties for people who provide information about self-managed abortions, whether over the phone or online. Women considering an abortion are often in distress, and now they cannot even reach out to friends or family without endangering themselves and others.

So far, Texas, Oklahoma, and Idaho have passed citizen-enforced abortion bans, according to which anyone can file a civil lawsuit to report an abortion and have the chance of winning at least ten thousand dollars. This is an incredible incentive to use personal data towards for-profit witch-hunting. Anyone can buy personal data from data brokers and fish for suspicious behavior. The surveillance machinery that we have built in the past two decades can now be put to use by authorities and vigilantes to criminalize pregnant women and their doctors, nurses, pharmacists, friends, and family. How productive.

It is not true, however, that the overturning of Roe v. Wade has made privacy more important than ever. Rather, it has provided yet another illustration of why privacy has always been and always will be important. That it is happening in the United States is helpful, because human beings are prone to thinking that whatever happens “over there” — say, in China now, or in East Germany during the Cold War — to those “other people,” doesn’t happen to us — until it does.

Privacy is important because it protects us from possible abuses of power. As long as human beings are human beings and organizations are organizations, abuses of power will be a constant temptation and threat. That is why it is supremely reckless to build a surveillance architecture. You never know when that data might be used against you — but you can be fairly confident that sooner or later it will be used against you. Collecting personal data might be convenient, but it is also a ticking bomb; it amounts to sensitive material waiting for the chance to turn into an instance of public shaming, extortion, persecution, discrimination, or identity theft. Do you think you have nothing to hide? So did many American women on June 24, only to realize that week that their period was late. You have plenty to hide — you just don’t know what it is yet and whom you should hide it from.

In the digital age, the challenge of protecting privacy is more formidable than most people imagine — but it is nowhere near impossible, and every bit worth putting up a fight for, if you care about democracy or freedom. The challenge is this: the dogma of our time is to turn analog into digital, and as things stand today, digitization is tantamount to surveillance…(More)”.