The Limitations of Privacy Rights


Paper by Daniel J. Solove: “Individual privacy rights are often at the heart of information privacy and data protection laws. The most comprehensive set of rights, from the European Union’s General Data Protection Regulation (GDPR), includes the right to access, right to rectification (correction), right to erasure, right to restriction, right to data portability, right to object, and right to not be subject to automated decisions. Privacy laws around the world include many of these rights in various forms.

In this article, I contend that although rights are an important component of privacy regulation, rights are often asked to do far more work than they are capable of doing. Rights can only give individuals a small amount of power. Ultimately, rights are at most capable of being a supporting actor, a small component of a much larger architecture. I advance three reasons why rights cannot serve as the bulwark of privacy protection. First, rights put too much onus on individuals when many privacy problems are systemic. Second, individuals lack the time and expertise to make difficult decisions about privacy, and rights cannot practically be exercised at scale with the number of organizations that process people’s data. Third, privacy cannot be protected by focusing solely on the atomistic individual. The personal data of many people is interrelated, and people’s decisions about their own data have implications for the privacy of other people.

The main goal of privacy rights is to provide individuals with control over their personal data. However, effective privacy protection involves not just facilitating individual control, but also bringing the collection, processing, and transfer of personal data under control. Privacy rights are not designed to achieve the latter goal, and they fail at the former.

After discussing these overarching reasons why rights are insufficient for the oversized role they currently play in privacy regulation, I discuss the common privacy rights and why each falls short of providing significant privacy protection. For each right, I propose broader structural measures that can achieve its underlying goals in a more systematic, rigorous, and less haphazard way…(More)”.

The GDPR effect: How data privacy regulation shaped firm performance globally


Paper by Carl Benedikt Frey and Giorgio Presidente:  “…To measure companies’ exposure to GDPR, we exploit international input-output tables and compute the shares of output sold to EU markets for each country and 2-digit industry. We then construct a shift-share instrument interacting this share with a dummy variable taking the value one from 2018 onwards.
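
To make the construction concrete, here is a minimal pandas sketch of the instrument described above. It is an illustration only: the column names (`eu_share`, `year`) and the helper function are hypothetical, not drawn from the paper’s replication code.

```python
import pandas as pd

def add_shift_share_instrument(df: pd.DataFrame) -> pd.DataFrame:
    """Interact a fixed EU-exposure share with a post-GDPR dummy.

    Expects one row per (country, industry, year) with:
      eu_share - pre-period share of output sold to EU markets
      year     - observation year
    """
    df = df.copy()
    df["post_gdpr"] = (df["year"] >= 2018).astype(int)      # dummy = 1 from 2018 onwards
    df["gdpr_exposure"] = df["eu_share"] * df["post_gdpr"]  # the shift-share instrument
    return df

# Example: an industry selling 30% of its output to the EU gets
# exposure 0.3 after 2018 and 0 before.
toy = pd.DataFrame({"year": [2016, 2019], "eu_share": [0.3, 0.3]})
print(add_shift_share_instrument(toy)[["year", "gdpr_exposure"]])
```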

Based on this approach, we find both channels discussed above to be quantitatively important, though the cost channel consistently dominates. On average, across our full sample, companies targeting EU markets saw an 8% reduction in profits and a relatively modest 2% decrease in sales (Figure 1). This suggests that earlier studies, which have focused on online outcomes or proxies of sales, provide an incomplete picture since companies have primarily been adversely affected through surging compliance costs. 

While systematic data on firms’ IT purchases are hard to come by, we can explore how companies developing digital technologies have responded to GDPR. Indeed, taking a closer look at some recent patent documents, we note that these include applications for technologies like a “system and method for providing general data protection regulation (GDPR) compliant hashing in blockchain ledgers”, which guarantees a user’s right to be forgotten. Another example is a ‘Data Consent Manager’, a computer-implemented method for managing consent for sharing data….
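
The patent excerpts above do not spell out their mechanisms, but a common way to reconcile an append-only ledger with the right to be forgotten is “crypto-shredding”: the chain stores only salted hashes of personal data, and destroying the off-chain salt severs any link to the user. The sketch below is a toy illustration of that general idea, not a reconstruction of the patented designs; all names are hypothetical.

```python
import hashlib, os

class GDPRLedger:
    """Toy crypto-shredding: the chain holds only salted hashes;
    per-user salts live off-chain and can be destroyed."""

    def __init__(self):
        self.chain = []   # append-only, never rewritten
        self.salts = {}   # off-chain, deletable per-user secrets

    def record(self, user_id: str, data: str) -> None:
        salt = self.salts.setdefault(user_id, os.urandom(16))
        digest = hashlib.sha256(salt + data.encode()).hexdigest()
        self.chain.append(digest)  # the on-chain entry reveals nothing by itself

    def verify(self, user_id: str, data: str) -> bool:
        salt = self.salts.get(user_id)
        if salt is None:
            return False  # salt destroyed: entries can no longer be linked to the user
        return hashlib.sha256(salt + data.encode()).hexdigest() in self.chain

    def forget(self, user_id: str) -> None:
        self.salts.pop(user_id, None)  # erasure without rewriting the immutable chain

ledger = GDPRLedger()
ledger.record("alice", "purchase #42")
assert ledger.verify("alice", "purchase #42")
ledger.forget("alice")
assert not ledger.verify("alice", "purchase #42")
```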

While the results reported above show that GDPR has reduced firm performance on average, they do not reveal how different types of firms have been affected. As is well-known, large companies have more technical and financial resources to comply with regulations (Brill 2011), invest more in lobbying (Bombardini 2008), and might be better placed to obtain consent for personal data processing from individual consumers (Goldfarb and Tucker 2011). For example, Facebook has reportedly hired some 1,000 engineers, managers, and lawyers globally in response to the new regulation. It also doubled its EU lobbying budget in 2017 compared with the previous year, when GDPR was announced. Indeed, according to LobbyFacts.eu, Google, Facebook and Apple now rank among the five biggest corporate spenders on lobbying in the EU, with annual budgets in excess of €3.5 million.

While these are significant costs that might reduce profits, the impact of the GDPR on the fortunes of big tech is ambiguous. As The New York Times writes, “Whether Europe’s tough approach is actually crimping the global tech giants is unclear… Amazon, Apple, Google and Facebook have continued to grow and add customers”. Indeed, by being better able to cope with the burdens of the regulation, these companies may have increased their market share at the expense of smaller companies (Johnson et al. 2020, Peukert et al. 2020). …(More)”.

Privacy As/And Civil Rights


Paper by Tiffany C. Li: “Decades have passed since the modern American civil rights movement began, but the fight for equality is far from over. Systemic racism, sexism, and discrimination against many marginalized groups are still rampant in our society. Tensions rose to a fever pitch in 2020, with a summer of Black Lives Matter protests, sparked by the police killing of George Floyd, leading into an attempted armed insurrection and attack on the U.S. Capitol on January 6, 2021. Asian-Americans faced rising rates of racism and hate crimes, spurred in part by inflammatory statements from the then-sitting President of the United States. Members of the LGBT community faced attacks on their civil rights during the Trump administration, including a rolling back of protections awarded to transgender individuals.

At the same time, the world faced a deadly pandemic that exposed the inequalities tearing the fabric of our society. The battle for civil rights is clearly not over, and the nation and the world have faced setbacks in the fight for equality, brought out by the pandemic, political pressures, and other factors. Meanwhile, the role of technology is also changing, with new technologies like facial recognition, artificial intelligence, and connected devices offering new threats and perhaps new hope for civil rights. To understand privacy at our current point in time, we must consider the role of privacy in civil rights—and even, as scholars like Alvaro Bedoya have suggested, privacy itself as a civil right.

This Article is an attempt to expand upon the work of privacy and civil rights scholars in conceptualizing privacy as a civil right and situating this concept within the broader field of privacy studies. This Article builds on the work of scholars who have analyzed critical dimensions of privacy and privacy law, and who have advocated for changes in privacy law that can move our society forward to protect privacy and equality for all…(More)”.

The New Rules of Data Privacy


Essay by Hossein Rahnama and Alex “Sandy” Pentland: “The data harvested from our personal devices, along with our trail of electronic transactions and data from other sources, now provides the foundation for some of the world’s largest companies. Personal data is also the wellspring for millions of small businesses and countless startups, which turn it into customer insights, market predictions, and personalized digital services. For the past two decades, the commercial use of personal data has grown in wild-west fashion. But now, because of consumer mistrust, government action, and competition for customers, those days are quickly coming to an end.

For most of its existence, the data economy was structured around a “digital curtain” designed to obscure the industry’s practices from lawmakers and the public. Data was considered company property and a proprietary secret, even though the data originated from customers’ private behavior. That curtain has since been lifted, and a convergence of consumer, government, and market forces is now giving users more control over the data they generate. Instead of treating personal data as a resource that can be freely harvested, countries in every region of the world have begun to treat it as an asset owned by individuals and held in trust by firms.

This will be a far better organizing principle for the data economy. Giving individuals more control has the potential to curtail the sector’s worst excesses while generating a new wave of customer-driven innovation, as customers begin to express what sort of personalization and opportunity they want their data to enable. And while Adtech firms in particular will be hardest hit, any firm with substantial troves of customer data will have to make sweeping changes to its practices, particularly large firms such as financial institutions, healthcare firms, utilities, and major manufacturers and retailers.

Leading firms are already adapting to the new reality as it unfolds. The key to this transition — based upon our research on data and trust, and our experience working on this issue with a wide variety of firms — is for companies to reorganize their data operations around the new fundamental rules of consent, insight, and flow…(More)”.

Privacy and/or Trade


Paper by Anupam Chander and Paul M. Schwartz: “International privacy and trade law developed together, but now are engaged in significant conflict. Current efforts to reconcile the two are likely to fail, and the result for globalization favors the largest international companies able to navigate the regulatory thicket. In a landmark finding, this Article shows that more than sixty countries outside the European Union are now evaluating whether foreign countries have privacy laws that are adequate to receive personal data. This core test for deciding on the permissibility of global data exchanges is currently applied in a nonuniform fashion with ominous results for the data flows that power trade today.

The promise of a global internet, with access for all, including companies from the Global South, is increasingly remote. This Article uncovers the forgotten and fateful history of the international regulation of privacy and trade that led to our current crisis and evaluates possible solutions to the current conflict. It proposes a Global Agreement on Privacy enforced within the trade order, but with external data privacy experts developing the treaty’s substantive norms….(More)”.

How privacy’s past may shape its future


Essay by Alessandro Acquisti, Laura Brandimarte and Jeff Hancock: “Continued expansion of human activities into digital realms gives rise to concerns about digital privacy and its invasions, often expressed in terms of data rights and internet surveillance. It may thus be tempting to construe privacy as a modern phenomenon—something our ancestors lacked and technological innovation and urban growth made possible. Research from history, anthropology, and ethnography suggests otherwise. The evidence for peoples seeking to manage the boundaries of private and public spans time and space, social class, and degree of technological sophistication. Privacy—not merely the hiding of data, but the selective opening and closing of the self to others—appears to be both culturally specific and culturally universal. But what could explain the simultaneous universality and diversity of a human drive for privacy? An account of the evolutionary roots of privacy may offer an answer and teach us about privacy’s digital future and how to manage it….(More)”.

Oversight Board publishes policy advisory opinion on the sharing of private residential information


Press Release by Oversight Board: “Last year, Meta requested a policy advisory opinion from the Board on the sharing of private residential addresses and images, and the contexts in which this information may be published on Facebook and Instagram. Meta considers this to be a difficult question as while access to such information can be relevant to journalism and civic activism, “exposing this information without consent can create a risk to residents’ safety and infringe on an individual’s privacy.”

Meta’s request noted several potential harms linked to releasing personal information, including residential addresses and images. These include “doxing” (from “dox,” an abbreviation of “documents”), in which information that can identify someone is revealed online. Meta noted that doxing can have negative real-world consequences, such as harassment or stalking…

The Board understands that the sharing of private residential addresses and images represents a potentially serious violation of the right to privacy both for people who use Facebook and Instagram, and those who do not.

Once this information is shared, the harms that can result, such as doxing, are difficult to remedy. Harms resulting from doxing disproportionately affect groups such as women, children and LGBTQIA+ people, and can include emotional distress, loss of employment and even physical harm or death.

As the potential for harm is particularly context specific, it is challenging to develop objective and universal indicators that would allow content reviewers to distinguish the sharing of content that would be harmful from shares that would not be. That is why the Board believes that the Privacy Violations policy should be more protective of privacy.

International human rights standards permit necessary and proportionate restrictions on expression to protect people’s right to privacy. As such, the Board favors narrowing the exceptions to the Privacy Violations policy to help Meta better protect the private residential information of people both on and off its platforms.

In exchanges with the Board, Meta stressed that “ensuring that the ‘publicly available’ definition does not exempt content from removal that poses a risk of offline harm” is a “persistent concern.” Public records and other sources of what could be considered “publicly available” information still require resources and effort to be accessed by the general public. On social media, however, such information may be shared and accessed more quickly, and on a much bigger scale, which significantly increases the risk of harm. As such, the Board proposes removing the “publicly available” exception for the sharing of both private residential addresses and images that meet certain criteria….(More)”.

Suicide hotline shares data with for-profit spinoff, raising ethical questions


Alexandra Levine at Politico: “Crisis Text Line is one of the world’s most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.

But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization’s for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.

Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly “anonymized,” stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world — in Loris’ case, by making “customer support more human, empathetic, and scalable.”

In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive…(More)”.

The UN is testing technology that processes data confidentially


The Economist: “Reasons of confidentiality mean that many medical, financial, educational and other personal records, from the analysis of which much public good could be derived, are in practice unavailable. A lot of commercial data are similarly sequestered. For example, firms have more granular and timely information on the economy than governments can obtain from surveys. But such intelligence would be useful to rivals. If companies could be certain it would remain secret, they might be more willing to make it available to officialdom.

A range of novel data-processing techniques might make such sharing possible. These so-called privacy-enhancing technologies (PETs) are still in the early stages of development. But they are about to get a boost from a project launched by the United Nations’ statistics division. The UN PETs Lab, which opened for business officially on January 25th, enables national statistics offices, academic researchers and companies to collaborate to carry out projects which will test various PETs, permitting technical and administrative hiccups to be identified and overcome.

The first such effort, which actually began last summer, before the PETs Lab’s formal inauguration, analysed import and export data from national statistical offices in America, Britain, Canada, Italy and the Netherlands, to look for anomalies. Those could be a result of fraud, of faulty record keeping or of innocuous re-exporting.

For the pilot scheme, the researchers used categories already in the public domain—in this case international trade in things such as wood pulp and clocks. They thus hoped to show that the system would work, before applying it to information where confidentiality matters.

They put several kinds of PETs through their paces. In one trial, OpenMined, a charity based in Oxford, tested a technique called secure multiparty computation (SMPC). Under this approach, the data to be analysed are encrypted by their keeper and never leave the premises. The organisation running the analysis (in this case OpenMined) sends its algorithm to the keeper, who runs it on the encrypted data. That is mathematically complex, but possible. The findings are then sent back to the original inquirer…(More)”.
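
To give a flavour of how such computations work, the sketch below shows one standard SMPC building block: a secure sum via additive secret sharing, in which each party learns the aggregate but no other party’s individual figure. Real deployments (including those tested by OpenMined) use far more elaborate protocols; this is a self-contained toy, and every name in it is hypothetical.

```python
import random

PRIME = 2**61 - 1  # all arithmetic is modulo a large prime

def make_shares(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n random additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def secure_sum(secrets: list[int]) -> int:
    """Each party splits its secret and hands one share to every other party.
    Each party publishes only the sum of the shares it holds, so only the
    aggregate is ever reconstructed."""
    n = len(secrets)
    all_shares = [make_shares(s, n) for s in secrets]
    partial_sums = [sum(all_shares[i][j] for i in range(n)) % PRIME
                    for j in range(n)]
    return sum(partial_sums) % PRIME

# Three statistics offices learn their combined trade volume,
# but none sees another office's individual figure.
print(secure_sum([1200, 3400, 560]))  # -> 5160
```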

Why Privacy Matters


Book by Neil Richards: “Many people tell us that privacy is dead, or that it is dying, but such talk is a dangerous fallacy. This book explains what privacy is, what privacy isn’t, and why privacy matters. Privacy is the extent to which human information is known or used, and it is fundamentally about the social power that human information provides over other people. The best way to ensure that power is checked and channeled in ways that benefit humans and their society is through rules—rules about human information. And because human information rules of some sort are inevitable, we should craft our privacy rules to promote human values. The book suggests three such values that our human information rules should promote: identity, freedom, and protection. Identity allows us to be thinking, self-defining humans; freedom lets us be citizens; while protection safeguards our roles as situated consumers and workers, allowing us, as members of society, to trust and rely on other people so that we can live our lives and hopefully build a better future together…(More)”.