The European Union-U.S. Data Privacy Framework


White House Fact Sheet: “Today, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (E.O.) directing the steps that the United States will take to implement the U.S. commitments under the European Union-U.S. Data Privacy Framework (EU-U.S. DPF) announced by President Biden and European Commission President von der Leyen in March of 2022. 

Transatlantic data flows are critical to enabling the $7.1 trillion EU-U.S. economic relationship.  The EU-U.S. DPF will restore an important legal basis for transatlantic data flows by addressing concerns that the Court of Justice of the European Union raised in striking down the prior EU-U.S. Privacy Shield framework as a valid data transfer mechanism under EU law. 

The Executive Order bolsters an already rigorous array of privacy and civil liberties safeguards for U.S. signals intelligence activities. It also creates an independent and binding mechanism enabling individuals in qualifying states and regional economic integration organizations, as designated under the E.O., to seek redress if they believe their personal data was collected through U.S. signals intelligence in a manner that violated applicable U.S. law.

U.S. and EU companies large and small across all sectors of the economy rely upon cross-border data flows to participate in the digital economy and expand economic opportunities. The EU-U.S. DPF represents the culmination of a joint effort by the United States and the European Commission to restore trust and stability to transatlantic data flows and reflects the strength of the enduring EU-U.S. relationship based on our shared values…(More)”.

Lawless Surveillance


Paper by Barry Friedman: “Here in the United States, policing agencies are engaging in mass collection of personal data, building a vast architecture of surveillance. License plate readers collect our location information. Mobile forensics data terminals suck in the contents of cell phones during traffic stops. CCTV maps our movements. Cheap storage means most of this is kept for long periods of time—sometimes in perpetuity. Artificial intelligence makes searching and mining the data a snap. For most of us whose data is collected, stored, and mined, there is no suspicion whatsoever of wrongdoing.

This growing network of surveillance is almost entirely unregulated. It is, in short, lawless. The Fourth Amendment touches almost none of it, either because what is captured occurs in public, and so is supposedly “knowingly exposed,” or because of doctrine that shields information collected from third parties. It is unregulated by statutes because legislative bodies—when they even know about these surveillance systems—see little profit in taking on the police.

In the face of growing concern over such surveillance, this Article argues there is a constitutional solution sitting in plain view. In virtually every other instance in which personal information is collected by the government, courts require that a sound regulatory scheme be in place before information collection occurs. The rulings on the mandatory nature of regulation are remarkably similar, no matter under which clause of the Constitution collection is challenged.

This Article excavates this enormous body of precedent and applies it to the problem of government mass data collection. It argues that before the government can engage in such surveillance, there must be a regulatory scheme in place. And by changing the default rule from allowing police to collect absent legislative prohibition, to banning collection until there is legislative action, legislatures will be compelled to act (or there will be no surveillance). The Article defines what a minimally acceptable regulatory scheme for mass data collection must include, and shows how it can be grounded in the Constitution…(More)”.

California Governor Signs Sweeping Children’s Online Safety Bill


Article by Natasha Singer: “California will adopt a broad new approach to protecting children online after Gov. Gavin Newsom signed a bill on Thursday that could transform how many social networks, games and other services treat minors.

Despite opposition from the tech industry, the State Legislature unanimously approved the bill at the end of August. It is the first state statute in the nation requiring online services likely to be used by youngsters to install wide-ranging safeguards for users under 18.

Among other things, the measure will require sites and apps to curb the risks that certain popular features — like allowing strangers to message one another — may pose to younger users. It will also require online services to turn on the highest privacy settings by default for children.

“We’re taking aggressive action in California to protect the health and well-being of our kids,” Governor Newsom said in a statement that heralded the new law as “bipartisan landmark legislation” aimed at protecting the well-being, data and privacy of children.

Called the California Age-Appropriate Design Code Act, the new legislation compels online services to take a proactive approach to safety — by designing their products and features from the outset with the “best interests” of young users in mind.

The California measure could apply to a wide range of popular digital products that people under 18 are likely to use: social networks, game platforms, connected toys, voice assistants and digital learning tools for schools. It could also affect children far beyond the state, prompting some services to introduce changes nationwide, rather than treat minors in California differently…(More)”.

Digital Privacy for Reproductive Choice in the Post-Roe Era


Paper by Aziz Z. Huq and Rebecca Wexler: “The overruling of Roe v. Wade unleashed a torrent of regulatory and punitive activity restricting lawful reproductive options. The turn to the expansive criminal law and new schemes of civil liability creates new, and quite different, concerns from the pre-Roe landscape a half-century ago. Reproductive choice, and its nemesis, rests on information. For pregnant people, deciding on a choice of medical care entails a search for advice and services. Information is at a premium for them. Meanwhile, efforts to regulate abortion begin with clinic closings, but quickly will extend to civil actions and criminal indictments of patients, providers, and those who facilitate abortions. Like the pregnant themselves, criminal and civil enforcers depend on information. And in the contemporary context, the informational landscape, and hence access to counseling and services such as medication abortion, is largely digital. In an era when most people use search engines or social media to access information, the digital architecture and data retention policies of those platforms will determine not only whether the pregnant can access medically accurate advice but also whether the mere act of doing so places them in legal peril.

This Article offers the first comprehensive accounting of abortion-related digital privacy after the end of Roe. It demonstrates first that digital privacy for pregnant persons in the United States has suddenly become a tremendously fraught and complex question. It then maps the treacherous social, legal and economic terrain upon which firms, individuals, and states will make privacy-related decisions. Building on this political economy, we develop a moral and economic argument to the effect that digital firms should maximize digital privacy for pregnant persons within the scope of the law, and should actively resist restrictionist states’ efforts to instrumentalize them into their war on reproductive choice. We then lay out precise, tangible steps that firms should take to enact this active resistance, explaining in particular a range of powerful yet legal options for firms to refuse cooperation with restrictionist criminal and civil investigations. Finally, we present an original, concrete and immediately actionable proposal for federal and state legislative intervention: a statutory evidentiary privilege to shield abortion-relevant data from restrictionist warrants, subpoenas, court orders, and judicial proceedings…(More)”.

Can Privacy Nudges be Tailored to Individuals’ Decision Making and Personality Traits?


Paper by Logan Warberg, Alessandro Acquisti and Douglas Sicker: “While the effectiveness of nudges in influencing user behavior has been documented within the literature, most prior work in the privacy field has focused on ‘one-size-fits-all’ interventions. Recent behavioral research has identified the potential of tailoring nudges to users by leveraging individual differences in decision making and personality. We present the results of three online experiments aimed at investigating whether nudges tailored to various psychometric scales can influence participants’ disclosure choices. Each study adopted a difference-in-differences design, testing whether differences in disclosure rates for participants presented with a nudge were affected by differences along various psychometric variables. Study 1 used a hypothetical disclosure scenario to measure participants’ responses to a single nudge. Study 2 and its replication (Study 3) tested responses in real disclosure scenarios to two nudges. Across all studies, we failed to find significant effects robustly linking any of the measured psychometric variables to differences in disclosure rates. We describe our study design and results along with a discussion of the practicality of using decision making and personality traits to tailor privacy nudges…(More)”.
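For readers unfamiliar with the design, here is a minimal sketch of the interaction-based analysis such a study implies, written in Python with hypothetical variable and file names (the authors’ actual models and measures may differ):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per participant.
#   disclosed - 1 if the participant disclosed the requested information, else 0
#   nudged    - 1 if shown the privacy nudge, 0 for the control condition
#   trait     - standardized psychometric score (e.g., a decision-making scale)
df = pd.read_csv("disclosure_experiment.csv")

# The difference-in-differences quantity of interest is the nudged:trait
# interaction: it estimates how the nudge's effect on disclosure varies with
# the psychometric trait. The paper's null result corresponds to this
# coefficient being statistically indistinguishable from zero across scales
# and across all three studies.
model = smf.ols("disclosed ~ nudged * trait", data=df).fit(cov_type="HC1")
print(model.summary())
```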

A South African City Says It’s Putting QR Codes On Informal Settlement Cabins To Help Services. But Residents And Privacy Experts Are Uncertain.


Article by Ray Mwareya: “Cape Town, South Africa’s second wealthiest city, is piloting a new plan for the 146,000 households in its informal settlements: QR-coding their homes.

City officials say the plan is to help residents get access to government services like welfare and provide an alternative to a formal street address so they can more easily get packages delivered or hail a taxi. But privacy experts warn that the city isn’t being clear about how the data will be stored or used, and the digital identification of poor Black residents could lead to retreading Cape Town’s ugly history of discrimination.

Cape Town’s government says it has marked 1,000 cabins in unofficial settlements with QR codes and made sure every individual’s information is checked, vetted, and saved by its corporate geographic information system…(More)”.

EU Court Expands Definition of Sensitive Data, Prompting Legal Concerns for Companies


Article by Catherine Stupp: “Companies will be under increased pressure after Europe’s top court ruled they must apply special protections to data that firms previously didn’t consider sensitive.

Under the European Union’s General Data Protection Regulation, information about health, religion, political views and sexual orientation are considered sensitive. Companies generally aren’t allowed to process it unless they apply special safeguards.

The European Court of Justice on Aug. 1 determined that public officials in Lithuania had their sensitive data revealed because their spouses’ names were published online, which could indicate their sexual orientation. Experts say the implications will extend to other types of potentially sensitive information.

Data that might be used to infer a sensitive piece of information about a person is also sensitive, the court said. That could include unstructured data—which isn’t organized in databases and is therefore more difficult to search through and analyze—such as surveillance camera footage in a hospital that indicates a person was treated there, legal experts say. Records of a special airplane meal might reveal religious views.

The court ruling “raises a lot of practical complexities and a lot of difficulty in understanding if the data [organizations] have is sensitive or not,” said Dr. Gabriela Zanfir-Fortuna, vice president for global privacy at the Future of Privacy Forum, a think tank based in Washington, D.C.

Many companies with large data sets may not know they hold details that indirectly relate to sensitive information, privacy experts say. Identifying where that data is and deciding whether it could reveal personal details about an individual would be a huge undertaking, said Tobias Judin, head of the international section at the Norwegian data protection regulator.

“You can’t really comply with the law if your data set becomes so big that you don’t really know what’s in it,” Mr. Judin said.

The GDPR says companies can only process sensitive data in a few circumstances, such as if a person gives explicit consent for it to be used for a specified purpose.

Regulators have been grappling with the question of how to determine what is sensitive data. The Norwegian regulator last year fined gay-dating app Grindr LLC 65 million kroner, equivalent to roughly $6.7 million. The regulator said the data was sensitive because use of the app indicated users’ sexual orientation.

Grindr said it doesn’t require users to share that data. The company appealed in February. Mr. Judin said his office is reviewing material submitted by the company as part of its appeal. Spain’s regulator came to a different conclusion in January, and found that data Grindr shared for advertising purposes wasn’t sensitive…(More)”.

Legislating Data Loyalty


Paper by Woodrow Hartzog and Neil M. Richards: “Lawmakers looking to embolden privacy law have begun to consider imposing duties of loyalty on organizations trusted with people’s data and online experiences. The idea behind loyalty is simple: organizations should not process data or design technologies that conflict with the best interests of trusting parties. But the logistics and implementation of data loyalty need to be developed if the concept is going to be capable of moving privacy law beyond its “notice and consent” roots to confront people’s vulnerabilities in their relationship with powerful data collectors.

In this short Essay, we propose a model for legislating data loyalty. Our model takes advantage of loyalty’s strengths—it is well-established in our law, it is flexible, and it can accommodate conflicting values. Our Essay also explains how data loyalty can embolden our existing data privacy rules, address emergent dangers, solve privacy’s problems around consent and harm, and establish an antibetrayal ethos as America’s privacy identity.

We propose that lawmakers use a two-step process to (1) articulate a primary, general duty of loyalty, then (2) articulate “subsidiary” duties that are more specific and sensitive to context. Subsidiary duties regarding collection, personalization, gatekeeping, persuasion, and mediation would target the most opportunistic contexts for self-dealing and result in flexible open-ended duties combined with highly specific rules. In this way, a duty of data loyalty is not just appealing in theory—it can be effectively implemented in practice just like the other duties of loyalty our law has recognized for hundreds of years. Loyalty is thus not only flexible, but it is capable of breathing life into America’s historically tepid privacy frameworks…(More)”.

The Privacy Elasticity of Behavior: Conceptualization and Application


Paper by Inbal Dekel, Rachel Cummings, Ori Heffetz & Katrina Ligett: “We propose and initiate the study of privacy elasticity—the responsiveness of economic variables to small changes in the level of privacy given to participants in an economic system. Individuals rarely experience either full privacy or a complete lack of privacy; we propose to use differential privacy—a computer-science theory increasingly adopted by industry and government—as a standardized means of quantifying continuous privacy changes. The resulting privacy measure implies a privacy-elasticity notion that is portable and comparable across contexts. We demonstrate the feasibility of this approach by estimating the privacy elasticity of public-good contributions in a lab experiment…(More)”.
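To make the quantification concrete: in differential privacy, a randomized mechanism M is ε-differentially private if, for all neighboring datasets D and D′ and every output set S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S], with smaller ε meaning stronger privacy. A natural way to formalize the paper’s elasticity notion, by analogy with a standard price elasticity (our notation, not necessarily the authors’), is

$$\eta_{\varepsilon} = \frac{\partial y / y}{\partial \varepsilon / \varepsilon} = \frac{\partial \ln y}{\partial \ln \varepsilon},$$

where y(ε) is the economic variable of interest, for example the mean public-good contribution observed in the lab experiment at privacy level ε.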

Roe’s overturn is tech’s privacy apocalypse


Scott Rosenberg at Axios: “America’s new abortion reality is turning tech firms’ data practices into an active field of conflict — a fight that privacy advocates have long predicted and company leaders have long feared.

Why it matters: A long legal siege in which abortion-banning states battle tech companies, abortion-friendly states and their own citizens to gather criminal evidence is now a near certainty.

  • The once-abstract privacy argument among policy experts has transformed overnight into a concrete real-world problem, superheated by partisan anger, affecting vast swaths of the U.S. population, with tangible and easily understood consequences.

Driving the news: Google announced Friday a new program to automatically delete the location data of users who visit “particularly personal” locations like “counseling centers, domestic violence shelters, abortion clinics, fertility centers, addiction treatment facilities, weight loss clinics, cosmetic surgery clinics, and others.”

  • Google tracks the location of any user who turns on its “location services” — a choice that’s required to make many of its programs, like Google Search and Maps, more useful.
  • That tracking happens even when you’re logged into non-location-related Google services like YouTube, since Google long ago unified all its accounts.
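Google has not published how the deletion works; purely as an illustration, the announced behavior amounts to a scrubbing rule of roughly the following shape, with all names and category labels hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical category labels based on the article's examples; the actual
# taxonomy and classifier Google uses are not public.
SENSITIVE_CATEGORIES = {
    "counseling_center", "domestic_violence_shelter", "abortion_clinic",
    "fertility_center", "addiction_treatment", "weight_loss_clinic",
    "cosmetic_surgery_clinic",
}

@dataclass
class Visit:
    place_id: str
    category: str       # assumed output of some place-classification step
    timestamp: datetime

def scrub_history(history: list[Visit]) -> list[Visit]:
    """Drop visits whose place is classified as sensitive."""
    return [v for v in history if v.category not in SENSITIVE_CATEGORIES]
```

As the open questions below suggest, the hard and unverifiable part is the classification step, not the deletion itself.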

Between the lines: Google’s move won cautious applause but left plenty of open concerns.

  • It’s not clear how, and how reliably, Google will identify the locations that trigger automatic data deletion.
  • The company will not delete search requests automatically — users who want their search history gone will have to delete it themselves.
  • A sudden gap in location data could itself be used as evidence in court…(More)”.