A Report from the Annenberg School for Communication: “Consent has always been a central part of Americans’ interactions with the commercial internet. Federal and state laws, as well as decisions from the Federal Trade Commission (FTC), require either implicit (“opt out”) or explicit (“opt in”) permission from individuals for companies to take and use data about them. Genuine opt out and opt in consent requires that people have knowledge about commercial data-extraction practices as well as a belief they can do something about them. As we approach the 30th anniversary of the commercial internet, the latest Annenberg national survey finds that Americans have neither. High percentages of Americans don’t know, admit they don’t know, and believe they can’t do anything about basic practices and policies around companies’ use of people’s data…
High levels of frustration, concern, and fear compound Americans’ confusion: 80% say they have little control over how marketers can learn about them online; 80% agree that what companies know about them from their online behaviors can harm them. These and related discoveries from our survey paint a picture of an unschooled and admittedly incapable society that rejects the internet industry’s insistence that people will accept tradeoffs for benefits and despairs of its inability to predictably control its digital life in the face of powerful corporate forces. At a time when individual consent lies at the core of key legal frameworks governing the collection and use of personal information, our findings describe an environment where genuine consent may not be possible….The aim of this report is to chart the particulars of Americans’ lack of knowledge about the commercial use of their data and their “dark resignation” in connection to it. Our goal is also to raise questions and suggest solutions about public policies that allow companies to gather, analyze, trade, and otherwise benefit from information they extract from large populations of people who are uninformed about how that information will be used and deeply concerned about the consequences of its use. In short, we find that informed consent at scale is a myth, and we urge policymakers to act with that in mind.”…(More)”.
‘Neurorights’ and the next flashpoint of medical privacy
Article by Alex LaCasse: “Around the world, leading neuroscientists, neuroethicists, privacy advocates and legal minds are taking greater interest in brain data and its potential.
Opinions vary widely on the long-term advancements in technology designed to measure brain activity and their impacts on society, as new products trickle out of clinical settings and gain traction for commercial applications.
Some say alarm bells should already be sounding and argue the technology could have corrosive effects on democratic society. Others counter that such claims are hyperbolic, given the uncertainty over whether the technology can even measure certain brain activity in the purported way.
Today, neurotechnology is primarily confined to medical and research settings, where various clinical-grade devices are used to monitor the brain activity of patients with mental illness or paralysis, gauge muscle movement, and record electroencephalography (the measurement of the brain’s electrical activity)….
“I intentionally don’t call this neurorights or brain rights. I call it cognitive liberty,” Duke University Law and Philosophy Professor Nita Farahany said during a LinkedIn Live session. “There is promise of this technology, not only for people who are struggling with a loss of speech and loss of motor activity, but for everyday people.”
The jumping-off point for the panel was Farahany’s new book, “The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology,” which examines the neurotechnology landscape and the potential negative outcomes of leaving it without regulatory oversight.
Farahany was motivated to write the book because she saw a “chasm” between what she thought neurotechnology was capable of and the reality of some companies working to one day decode people’s inner thoughts on some level…(More)” (Book).
Privacy Decisions are not Private: How the Notice and Choice Regime Induces us to Ignore Collective Privacy Risks and what Regulation should do about it
Paper by Christopher Jon Sprigman and Stephan Tontrup: “For many reasons, the current notice and choice privacy framework fails to empower individuals to make their own privacy choices effectively. In this Article we offer evidence from three novel experiments showing that a cognitive error lies at the core of this failure. Notice and choice caters to a heuristic that people employ to make privacy decisions, one meant to judge trustworthiness in face-to-face situations. In the online context, it distorts privacy decision-making and leaves potential disclosers vulnerable to exploitation.
From our experimental evidence exploring the heuristic’s effect, we conclude that privacy law must become more behaviorally aware. Specifically, privacy law must be redesigned to intervene in the cognitive mechanisms that keep individuals from making better privacy decisions. A behaviorally aware privacy regime must centralize, standardize, and simplify the framework for making privacy choices.
To achieve these goals, we propose a master privacy template that requires consumers to define their privacy preferences in advance. Doing so avoids presenting consumers with a concrete counterparty, which in turn prevents them from applying the trust heuristic and reduces many other biases that affect privacy decision-making. Our data show that blocking the heuristic enables consumers to weigh relevant privacy cues and to consider the externalities their privacy decisions cause.
The master privacy template provides a much more effective platform for regulation. Through the master template, the regulator can set the standard for automated communication between user clients and website interfaces, a facility we expect to enhance enforcement and sharpen competition over privacy terms…(More)”.
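To make the proposal concrete, here is a minimal sketch of how such a template might work: the consumer fixes preferences once, in the abstract, and a client agent later compares them against each site’s declared data practices, so no concrete counterparty is ever presented at decision time. The field names and decision rule below are our illustration, not the authors’ specification.

```python
# Hypothetical sketch of a machine-readable "master privacy template".
# The user defines preferences once; a client agent then answers each
# site's declared data practices automatically. All field names and the
# decision rule are illustrative assumptions, not from the paper.

from dataclasses import dataclass

@dataclass
class PrivacyTemplate:
    allow_third_party_sharing: bool
    allow_behavioral_ads: bool
    max_retention_days: int

@dataclass
class SitePractices:
    shares_with_third_parties: bool
    uses_behavioral_ads: bool
    retention_days: int

def consent_decision(template: PrivacyTemplate, site: SitePractices) -> bool:
    """Grant consent only if the site's declared practices satisfy
    every preference the user fixed in advance."""
    if site.shares_with_third_parties and not template.allow_third_party_sharing:
        return False
    if site.uses_behavioral_ads and not template.allow_behavioral_ads:
        return False
    return site.retention_days <= template.max_retention_days

# Example: a cautious template automatically refuses a site that
# shares data with third parties, whatever its retention period.
user = PrivacyTemplate(allow_third_party_sharing=False,
                       allow_behavioral_ads=False,
                       max_retention_days=90)
site = SitePractices(shares_with_third_parties=True,
                     uses_behavioral_ads=False,
                     retention_days=30)
print(consent_decision(user, site))  # False
```

Because the rule is fixed before any particular website comes into view, the face-to-face trust heuristic the authors identify never gets a counterparty to latch onto.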
Data and the Digital Self
Report by the ACS: “A series of essays by some of the leading minds on data sharing and privacy in Australia, this book examines some of the critical data-related issues facing Australia today and tomorrow. It looks at digital identity and privacy in the 21st century; at what privacy laws need to look like to be effective in the era of big data; at how businesses and governments can work better to build trust in this new era; and at how we need to look beyond just privacy and personal information as we develop solutions over the coming decades…(More)”.
“How Dare They Peep into My Private Life”
Report by Human Rights Watch on “Children’s Rights Violations by Governments that Endorsed Online Learning During the Covid-19 Pandemic”: “The coronavirus pandemic upended the lives and learning of children around the world. Most countries pivoted to some form of online learning, replacing physical classrooms with EdTech websites and apps; this helped fill urgent gaps in delivering education to many children.
But in their rush to connect children to virtual classrooms, few governments checked whether the EdTech they were rapidly endorsing or procuring for schools were safe for children. As a result, children whose families were able to afford access to the internet and connected devices, or who made hard sacrifices in order to do so, were exposed to the privacy practices of the EdTech products they were told or required to use during Covid-19 school closures.
Human Rights Watch conducted its technical analysis of the products between March and August 2021, and subsequently verified its findings as detailed in the methodology section. Each analysis essentially took a snapshot of the prevalence and frequency of tracking technologies embedded in each product on a given date in that window. That prevalence and frequency may fluctuate over time based on multiple factors, meaning that an analysis conducted on later dates might observe variations in the behavior of the products…(More)”.
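The snapshot methodology described above can be pictured with a minimal sketch: given network requests captured from an EdTech product on a given date, tally how many go to known tracking domains. The blocklist entries and the captured requests below are illustrative assumptions; HRW’s actual tooling and verification steps are described in the report’s methodology section.

```python
# Illustrative sketch of a tracker-prevalence snapshot: count captured
# network requests whose host matches a known tracking domain. The
# domain list and request log here are examples, not HRW's data.

from urllib.parse import urlparse
from collections import Counter

TRACKER_DOMAINS = {"doubleclick.net", "graph.facebook.com", "app-measurement.com"}

def tracker_counts(request_urls):
    """Tally requests whose host is, or is a subdomain of, a tracker."""
    hits = Counter()
    for url in request_urls:
        host = urlparse(url).hostname or ""
        for tracker in TRACKER_DOMAINS:
            if host == tracker or host.endswith("." + tracker):
                hits[tracker] += 1
    return hits

captured = [
    "https://ads.doubleclick.net/pixel?id=123",
    "https://graph.facebook.com/v12.0/app/activities",
    "https://example-edtech.com/lesson/4",
]
print(tracker_counts(captured))  # frequency snapshot for this capture date
```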
The Economics of Digital Privacy
Paper by Avi Goldfarb & Verina F. Que: “There has been increasing attention to privacy in the media and in regulatory discussions. This is a consequence of the increased usefulness of digital data. The literature has emphasized the benefits and costs of digital data flows to consumers and firms. The benefits arise in the form of data-driven innovation, higher quality products and services that match consumer needs, and increased profits. The costs relate to intrinsic and instrumental values of privacy. Under standard economic assumptions, this framing of a cost-benefit tradeoff might suggest little role for regulation beyond ensuring consumers are appropriately informed in a robust competitive environment. The empirical literature thus far has focused on this direct cost-benefit assessment, examining how privacy regulations have affected various market outcomes. However, an increasing body of theory work emphasizes externalities related to data flows. These externalities, both positive and negative, suggest benefits to the targeted regulation of digital privacy…(More)”.
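One stylized way to formalize the externality point (the notation is ours, not the paper’s): let consumer i choose whether to disclose (d_i = 1 or 0), earning private benefit b_i at private cost c_i, while each disclosure also imposes a per-person externality e on the other n − 1 consumers, for example because data about one person lets firms draw inferences about similar people.

```latex
% Private decision-making ignores e; a social planner does not.
\text{Private rule: } d_i = 1 \iff b_i > c_i
\qquad
\text{Social rule: } d_i = 1 \iff b_i - c_i + (n-1)\,e > 0
```

When e < 0, individuals disclose too much relative to the social optimum; when e > 0, too little. Either gap persists even with fully informed consumers in competitive markets, which is one way to read the case for targeted regulation.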
UN Guide on Privacy-Enhancing Technologies for Official Statistics
UN Guide: “This document presents methodologies and approaches to mitigating privacy risks when using sensitive or confidential data, which are collectively referred to as privacy-enhancing technologies (PETs). National Statistical Offices (NSOs) are entrusted with data that has the potential to drive innovation and improve national services, research, and social benefit. Yet, there has been a rise in sustained cyber threats, complex networks of intermediaries motivated to procure sensitive data, and advances in methods to re-identify and link data to individuals and across multiple data sources. Data breaches erode public trust and can have serious negative consequences for individuals, groups, and communities. This document focuses on PETs that protect data during analysis and dissemination of sensitive information so that the benefits of using data for official statistics can be realized while minimizing privacy risks to those entrusting sensitive data to NSOs…(More)”.
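As a concrete, deliberately simplified example of one PET family such guidance covers, the Laplace mechanism for differential privacy adds calibrated noise to a statistic before dissemination. The parameter values below are illustrative assumptions; choosing and accounting for a privacy budget is a substantial exercise that the guide treats in far more depth.

```python
# Minimal sketch of the Laplace mechanism for differential privacy,
# as an NSO might apply it to a count before publication. Epsilon and
# the example count are illustrative, not recommendations.

import numpy as np

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace noise scaled to sensitivity/epsilon.

    Adding or removing one respondent changes a count by at most 1,
    so sensitivity = 1 yields epsilon-differential privacy.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

print(f"Noisy count for publication: {dp_count(1204, epsilon=0.5):.1f}")
```

Smaller values of epsilon buy stronger privacy at the price of noisier published statistics, mirroring the benefit-versus-risk balance described above.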
Privacy
Book edited by Carissa Véliz and Steven M. Cahn: “Companies collect and share much of your daily life, from your location and search history, to your likes, habits, and relationships. As more and more of our personal data is collected, analyzed, and distributed, we need to think carefully about what we might be losing when we give up our privacy.
Privacy is a thought-provoking collection of philosophical essays on privacy, offering deep insights into the nature of privacy, its value, and the consequences of its loss. Bringing together both classic and contemporary work, this timely volume explores the theories, issues, debates, and applications of the philosophical study of privacy. The essays address concealment and exposure, the liberal value of privacy, privacy in social media, privacy rights and public information, privacy and the limits of law, and more…(More)”.
Americans Don’t Understand What Companies Can Do With Their Personal Data — and That’s a Problem
Press Release by the Annenberg School for Communication: “Have you ever had the experience of browsing for an item online, only to then see ads for it everywhere? Or watching a TV program, and suddenly your phone shows you an ad related to the topic? Marketers clearly know a lot about us, but the extent of what they know, how they know it, and what they’re legally allowed to know can feel awfully murky.
In a new report, “Americans Can’t Consent to Companies’ Use of Their Data,” researchers asked a nationally representative group of more than 2,000 Americans to answer a set of questions about digital marketing policies and how companies can and should use their personal data. Their aim was to determine if current “informed consent” practices are working online.
They found that the great majority of Americans don’t understand the fundamentals of internet marketing practices and policies, and that many feel incapable of consenting to how companies use their data. As a result, the researchers say, Americans can’t truly give informed consent to digital data collection.
The survey revealed that 56% of American adults don’t understand the term “privacy policy,” often believing it means that a company won’t share their data with third parties without permission. In fact, many of these policies state that a company can share or sell any data it gathers about site visitors with other websites or companies.
Perhaps because internet privacy, with its “opting out” or “opting in,” biometrics, and VPNs, feels impossible to comprehend, many Americans don’t trust what is being done with their digital data. Eighty percent of Americans believe that what companies know about them can cause them harm.
“People don’t feel that they have the ability to protect their data online — even if they want to,” says lead researcher Joseph Turow, Robert Lewis Shayon Professor of Media Systems & Industries at the Annenberg School for Communication at the University of Pennsylvania…(More)”.
Engineering Personal Data Sharing
Report by ENISA: “This report takes a closer look at specific use cases relating to personal data sharing, primarily in the health sector, and discusses how specific technologies and implementation considerations can support the meeting of specific data protection principles. After discussing some challenges in (personal) data sharing, this report demonstrates how to engineer specific technologies and techniques in order to enable privacy-preserving data sharing. More specifically, it discusses use cases for sharing data in the health sector, with the aim of demonstrating how data protection principles can be met through the proper use of technological solutions relying on advanced cryptographic techniques. Next, it discusses data sharing that takes place as part of another process or service, where the data is processed through some secondary channel or entity before reaching its primary recipient. Lastly, it identifies challenges, considerations, and possible architectural solutions for intervenability aspects (such as the right to erasure and the right to rectification when sharing data)…(More)”.
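As an illustration of one cryptographic building block behind such privacy-preserving sharing (our sketch, not ENISA’s reference design), hybrid encryption lets a sender encrypt a health record so that only the intended recipient can read it, even when the data passes through a secondary channel or intermediary. Key management, authentication, and the intervenability mechanisms the report discusses are deliberately out of scope here.

```python
# Minimal hybrid-encryption sketch using the "cryptography" package:
# encrypt a record with a fresh symmetric key, then wrap that key with
# the recipient's RSA public key. Illustrative only; a real deployment
# needs key distribution, authentication, and auditability.

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Recipient generates a key pair and publishes the public half.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

# Sender: symmetric encryption for the record, asymmetric for the key.
record = b'{"patient": "pseudonym-42", "result": "negative"}'
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(record)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient_public.encrypt(data_key, oaep)

# Recipient: unwrap the key, then decrypt the record. Any intermediary
# that relays (ciphertext, wrapped_key) learns nothing about the record.
unwrapped = recipient_private.decrypt(wrapped_key, oaep)
assert Fernet(unwrapped).decrypt(ciphertext) == record
```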