Ethical Considerations in Re-Using Private Sector Data for Migration-Related Policy


IOM practitioner’s paper: “This paper assesses the ethical risks of using non-traditional data sources to inform migration-related policymaking and suggests practical safeguards for various stages of the data cycle. The past decade has witnessed the rapid growth of non-traditional data (social media, mobile phones, satellite data, bank records, etc.) and their use in migration research and policy. While these data sources may be tempting and can shed light on key migration trends, ensuring the ethical and responsible use of big data at every stage of migration research and policymaking is complex.

The recognition of the potential of new data sources for migration policy has grown exponentially in recent years. Data innovation is one of the cross-cutting priorities of IOM’s Migration Data Strategy. Further, the UN General Assembly recognises rapid technological developments and their potential in achieving the Sustainable Development Goals, and the Global Compact for Safe, Orderly and Regular Migration highlights the importance of harnessing data innovation to improve data and evidence for informed policies on migration. However, with big data come big risks. New technological developments have raised new challenges, particularly concerning data protection, individual privacy, human security, and fundamental rights. These risks can be greater for certain migrant and displaced groups.
The identified risks are:…(More)” (see also Big Data for Migration Alliance)

Vulnerable People and Data Protection Law


Book by Gianclaudio Malgieri: “Human vulnerability has traditionally been viewed through the lens of specific groups of people, such as ethnic minorities, children, the elderly, or people with disabilities. With the rise of digital media, our perceptions of vulnerable groups and individuals have been reshaped as new vulnerabilities and different vulnerable sub-groups of users, consumers, citizens, and data subjects emerge.

Vulnerable People and Data Protection Law not only depicts these problems but offers the reader a detailed investigation of the concept of data subjects and a reconceptualisation of the notion of vulnerability within the General Data Protection Regulation. The regulation offers a forward-facing set of tools that – though largely underexplored – are essential in rebalancing power asymmetries and mitigating induced vulnerabilities in the age of artificial intelligence.

This book proposes a layered approach to data subject definition. Considering the new potentialities of the digital market, the new awareness about cognitive weaknesses, and the new philosophical sensitivity about vulnerability conditions, the author looks for a more general definition of vulnerability that goes beyond traditional labels. In doing so, he seeks to promote a ‘vulnerability-aware’ interpretation of the GDPR.

A heuristic analysis that re-interprets the whole GDPR, this work is a must-read both for scholars of data protection law and for policymakers looking to strengthen regulations and protect the data of vulnerable individuals…(More)”.

Digitization, Surveillance, Colonialism


Essay by Carissa Veliz: “As I write these words, articles are mushrooming in newspapers and magazines about how privacy is more important than ever after the Supreme Court ruling that overturned the constitutional right to abortion in the United States. In anti-abortion states, browsing histories, text messages, location data, payment data, and information from period-tracking apps can all be used to prosecute both women seeking an abortion and anyone aiding them. The National Right to Life Committee recently published policy recommendations for anti-abortion states that include criminal penalties for people who provide information about self-managed abortions, whether over the phone or online. Women considering an abortion are often in distress, and now they cannot even reach out to friends or family without endangering themselves and others.

So far, Texas, Oklahoma, and Idaho have passed citizen-enforced abortion bans, according to which anyone can file a civil lawsuit to report an abortion and have the chance of winning at least ten thousand dollars. This is an incredible incentive to use personal data towards for-profit witch-hunting. Anyone can buy personal data from data brokers and fish for suspicious behavior. The surveillance machinery that we have built in the past two decades can now be put to use by authorities and vigilantes to criminalize pregnant women and their doctors, nurses, pharmacists, friends, and family. How productive.

It is not true, however, that the overturning of Roe v. Wade has made privacy more important than ever. Rather, it has provided yet another illustration of why privacy has always been and always will be important. That it is happening in the United States is helpful, because human beings are prone to thinking that whatever happens “over there” — say, in China now, or in East Germany during the Cold War — to those “other people” doesn’t happen to us — until it does.

Privacy is important because it protects us from possible abuses of power. As long as human beings are human beings and organizations are organizations, abuses of power will be a constant temptation and threat. That is why it is supremely reckless to build a surveillance architecture. You never know when that data might be used against you — but you can be fairly confident that sooner or later it will be used against you. Collecting personal data might be convenient, but it is also a ticking bomb; it amounts to sensitive material waiting for the chance to turn into an instance of public shaming, extortion, persecution, discrimination, or identity theft. Do you think you have nothing to hide? So did many American women on June 24, only to realize that week that their period was late. You have plenty to hide — you just don’t know what it is yet and whom you should hide it from.

In the digital age, the challenge of protecting privacy is more formidable than most people imagine — but it is nowhere near impossible, and every bit worth putting up a fight for, if you care about democracy or freedom. The challenge is this: the dogma of our time is to turn analog into digital, and as things stand today, digitization is tantamount to surveillance…(More)”.

The European Union-U.S. Data Privacy Framework


White House Fact Sheet: “Today, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (E.O.) directing the steps that the United States will take to implement the U.S. commitments under the European Union-U.S. Data Privacy Framework (EU-U.S. DPF) announced by President Biden and European Commission President von der Leyen in March of 2022. 

Transatlantic data flows are critical to enabling the $7.1 trillion EU-U.S. economic relationship. The EU-U.S. DPF will restore an important legal basis for transatlantic data flows by addressing concerns that the Court of Justice of the European Union raised in striking down the prior EU-U.S. Privacy Shield framework as a valid data transfer mechanism under EU law.

The Executive Order bolsters an already rigorous array of privacy and civil liberties safeguards for U.S. signals intelligence activities. It also creates an independent and binding mechanism enabling individuals in qualifying states and regional economic integration organizations, as designated under the E.O., to seek redress if they believe their personal data was collected through U.S. signals intelligence in a manner that violated applicable U.S. law.

U.S. and EU companies large and small across all sectors of the economy rely upon cross-border data flows to participate in the digital economy and expand economic opportunities. The EU-U.S. DPF represents the culmination of a joint effort by the United States and the European Commission to restore trust and stability to transatlantic data flows and reflects the strength of the enduring EU-U.S. relationship based on our shared values…(More)”.

Lawless Surveillance


Paper by Barry Friedman: “Here in the United States, policing agencies are engaging in mass collection of personal data, building a vast architecture of surveillance. License plate readers collect our location information. Mobile forensics data terminals suck in the contents of cell phones during traffic stops. CCTV maps our movements. Cheap storage means most of this is kept for long periods of time—sometimes into perpetuity. Artificial intelligence makes searching and mining the data a snap. For most of us whose data is collected, stored, and mined, there is no suspicion whatsoever of wrongdoing.

This growing network of surveillance is almost entirely unregulated. It is, in short, lawless. The Fourth Amendment touches almost none of it, either because what is captured occurs in public, and so is supposedly “knowingly exposed,” or because of doctrine that shields information collected from third parties. It is unregulated by statutes because legislative bodies—when they even know about these surveillance systems—see little profit in taking on the police.

In the face of growing concern over such surveillance, this Article argues there is a constitutional solution sitting in plain view. In virtually every other instance in which personal information is collected by the government, courts require that a sound regulatory scheme be in place before information collection occurs. The rulings on the mandatory nature of regulation are remarkably similar, no matter under which clause of the Constitution collection is challenged.

This Article excavates this enormous body of precedent and applies it to the problem of government mass data collection. It argues that before the government can engage in such surveillance, there must be a regulatory scheme in place. And changing the default rule, from allowing police to collect absent legislative prohibition to banning collection until there is legislative action, will compel legislatures to act (or there will be no surveillance). The Article defines what a minimally acceptable regulatory scheme for mass data collection must include, and shows how it can be grounded in the Constitution…(More)”.

California Governor Signs Sweeping Children’s Online Safety Bill


Article by Natasha Singer: “California will adopt a broad new approach to protecting children online after Gov. Gavin Newsom signed a bill on Thursday that could transform how many social networks, games and other services treat minors.

Despite opposition from the tech industry, the State Legislature unanimously approved the bill at the end of August. It is the first state statute in the nation requiring online services likely to be used by youngsters to install wide-ranging safeguards for users under 18.

Among other things, the measure will require sites and apps to curb the risks that certain popular features — like allowing strangers to message one another — may pose to younger users. It will also require online services to turn on the highest privacy settings by default for children.
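
Purely as an illustration, not language from the statute: the requirement that online services “turn on the highest privacy settings by default” for minors might translate into product code along the following lines. All type and field names here are hypothetical, a minimal sketch rather than anything prescribed by the Act.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_public: bool
    allow_stranger_messages: bool
    location_sharing: bool
    personalized_ads: bool

def default_settings(age: int) -> PrivacySettings:
    """Hypothetical sketch: minors get the most protective defaults."""
    if age < 18:
        return PrivacySettings(
            profile_public=False,
            allow_stranger_messages=False,  # the feature the bill singles out
            location_sharing=False,
            personalized_ads=False,
        )
    # Adults keep the service's standard defaults, whatever those are.
    return PrivacySettings(True, True, True, True)
```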

“We’re taking aggressive action in California to protect the health and well-being of our kids,” Governor Newsom said in a statement that heralded the new law as “bipartisan landmark legislation” aimed at protecting the well-being, data and privacy of children.

Called the California Age-Appropriate Design Code Act, the new legislation compels online services to take a proactive approach to safety — by designing their products and features from the outset with the “best interests” of young users in mind.

The California measure could apply to a wide range of popular digital products that people under 18 are likely to use: social networks, game platforms, connected toys, voice assistants and digital learning tools for schools. It could also affect children far beyond the state, prompting some services to introduce changes nationwide, rather than treat minors in California differently…(More)”.

Digital Privacy for Reproductive Choice in the Post-Roe Era


Paper by Aziz Z. Huq and Rebecca Wexler: “The overruling of Roe v. Wade unleashed a torrent of regulatory and punitive activity restricting lawful reproductive options. The turn to expansive criminal law and new schemes of civil liability creates new, and quite different, concerns from the pre-Roe landscape of a half-century ago. Reproductive choice, and its nemesis, rests on information. For pregnant people, deciding on a choice of medical care entails a search for advice and services. Information is at a premium for them. Meanwhile, efforts to regulate abortion begin with clinic closings, but quickly will extend to civil actions and criminal indictments of patients, providers, and those who facilitate abortions. Like the pregnant themselves, criminal and civil enforcers depend on information. And in the contemporary context, the informational landscape, and hence access to counseling and services such as medication abortion, is largely digital. In an era when most people use search engines or social media to access information, the digital architecture and data retention policies of those platforms will determine not only whether the pregnant can access medically accurate advice but also whether the mere act of doing so places them in legal peril.

This Article offers the first comprehensive accounting of abortion-related digital privacy after the end of Roe. It demonstrates first that digital privacy for pregnant persons in the United States has suddenly become a tremendously fraught and complex question. It then maps the treacherous social, legal and economic terrain upon which firms, individuals, and states will make privacy related decisions. Building on this political economy, we develop a moral and economic argument to the effect that digital firms should maximize digital privacy for pregnant persons within the scope of the law, and should actively resist restrictionist states’ efforts to instrumentalize them into their war on reproductive choice. We then lay out precise, tangible steps that firms should take to enact this active resistance, explaining in particular a range of powerful yet legal options for firms to refuse cooperation with restrictionist criminal and civil investigations. Finally, we present an original, concrete and immediately actionable proposal for federal and state legislative intervention: a statutory evidentiary privilege to shield abortion-relevant data from restrictionist warrants, subpoenas, court orders, and judicial proceedings…(More)”

Can Privacy Nudges be Tailored to Individuals’ Decision Making and Personality Traits?


Paper by Logan Warberg, Alessandro Acquisti and Douglas Sicker: “While the effectiveness of nudges in influencing user behavior has been documented within the literature, most prior work in the privacy field has focused on ‘one-size-fits-all’ interventions. Recent behavioral research has identified the potential of tailoring nudges to users by leveraging individual differences in decision making and personality. We present the results of three online experiments aimed at investigating whether nudges tailored to various psychometric scales can influence participants’ disclosure choices. Each study adopted a difference-in-differences design, testing whether differences in disclosure rates for participants presented with a nudge were affected by differences along various psychometric variables. Study 1 used a hypothetical disclosure scenario to measure participants’ responses to a single nudge. Study 2 and its replication (Study 3) tested responses to two nudges in real disclosure scenarios. Across all studies, we failed to find significant effects robustly linking any of the measured psychometric variables to differences in disclosure rates. We describe our study design and results along with a discussion of the practicality of using decision making and personality traits to tailor privacy nudges…(More)”.
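
The excerpt does not include the authors’ analysis code, but the difference-in-differences logic it describes (testing whether a nudge’s effect on disclosure varies with a psychometric trait) can be sketched on synthetic data. Every column name below is hypothetical, and the data are random, so the interaction is null by construction, which happens to mirror the paper’s reported result.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: one row per participant, with a binary
# disclosure outcome, a nudge indicator, and a psychometric score.
rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "disclosed": rng.integers(0, 2, size=n),  # 1 = participant disclosed
    "nudged": rng.integers(0, 2, size=n),     # 1 = shown the privacy nudge
    "trait": rng.normal(0.0, 1.0, size=n),    # standardized psychometric score
})

# The tailoring question is a moderation test: does the nudge's effect
# on disclosure differ across levels of the trait? The nudged:trait
# interaction coefficient captures that difference in differences.
model = smf.logit("disclosed ~ nudged * trait", data=df).fit(disp=0)
print(model.summary().tables[1])
```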

A South African City Says It’s Putting QR Codes On Informal Settlement Cabins To Help Services. But Residents And Privacy Experts Are Uncertain.


Article by Ray Mwareya: “Cape Town, South Africa’s second wealthiest city, is piloting a new plan for the 146,000 households in its informal settlements: QR-coding their homes.

City officials say the plan is to help residents get access to government services like welfare and provide an alternative to a formal street address so they can more easily get packages delivered or hail a taxi. But privacy experts warn that the city isn’t being clear about how the data will be stored or used, and the digital identification of poor Black residents could lead to retreading Cape Town’s ugly history of discrimination.

Cape Town’s government says it has marked 1,000 cabins in unofficial settlements with QR codes and made sure every individual’s information is checked, vetted, and saved by its corporate geographic information system…(More)”.
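
The article does not specify what the QR codes encode. As a rough sketch of the general mechanism only, assuming a hypothetical structure identifier and coordinates that the city’s geographic information system could resolve to a cabin, the labels could be generated like this:

```python
import qrcode  # third-party library: pip install "qrcode[pil]"

# Hypothetical payload: an illustrative settlement/structure ID plus
# coordinates that a city GIS record might resolve to a specific cabin.
payload = "CT;SETTLEMENT-0421;STRUCTURE-1083;-33.9249,18.4241"

img = qrcode.make(payload)       # build the QR code image
img.save("structure_1083.png")   # label to be printed and affixed to the cabin
```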

EU Court Expands Definition of Sensitive Data, Prompting Legal Concerns for Companies


Article by Catherine Stupp: “Companies will be under increased pressure after Europe’s top court ruled they must apply special protections to data that firms previously didn’t consider sensitive.

Under the European Union’s General Data Protection Regulation, information about health, religion, political views and sexual orientation is considered sensitive. Companies generally aren’t allowed to process it unless they apply special safeguards.

The European Court of Justice on Aug. 1 determined that public officials in Lithuania had their sensitive data revealed because their spouses’ names were published online, which could indicate their sexual orientation. Experts say the implications will extend to other types of potentially sensitive information.

Data that might be used to infer a sensitive piece of information about a person is also sensitive, the court said. That could include unstructured data—which isn’t organized in databases and is therefore more difficult to search through and analyze—such as surveillance camera footage in a hospital that indicates a person was treated there, legal experts say. Records of a special airplane meal might reveal religious views.
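
As a toy illustration of the inference problem the court’s reasoning raises: a naive first pass might scan a dataset’s fields against a list of known proxies for special-category data. The proxy list and field names here are invented for this sketch, and a real compliance review would be far more involved.

```python
# Hypothetical mapping from innocuous-looking fields to the sensitive
# attribute they could reveal, echoing the examples in the ruling.
PROXY_FIELDS = {
    "meal_preference": "religion",        # e.g. special airplane meals
    "spouse_name": "sexual orientation",  # the fact pattern in the Lithuania case
    "hospital_visits": "health",
}

def flag_possible_sensitive(columns: list[str]) -> dict[str, str]:
    """Return the subset of columns that match a known proxy field."""
    return {c: PROXY_FIELDS[c] for c in columns if c in PROXY_FIELDS}

print(flag_possible_sensitive(["user_id", "meal_preference", "spouse_name"]))
```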

The court ruling “raises a lot of practical complexities and a lot of difficulty in understanding if the data [organizations] have is sensitive or not,” said Dr. Gabriela Zanfir-Fortuna, vice president for global privacy at the Future of Privacy Forum, a think tank based in Washington, D.C.

Many companies with large data sets may not know they hold details that indirectly relate to sensitive information, privacy experts say. Identifying where that data is and deciding whether it could reveal personal details about an individual would be a huge undertaking, said Tobias Judin, head of the international section at the Norwegian data protection regulator.

“You can’t really comply with the law if your data set becomes so big that you don’t really know what’s in it,” Mr. Judin said.

The GDPR says companies can only process sensitive data in a few circumstances, such as if a person gives explicit consent for it to be used for a specified purpose.

Regulators have been grappling with the question of how to determine what is sensitive data. The Norwegian regulator last year fined gay-dating app Grindr LLC 65 million kroner, equivalent to roughly $6.7 million. The regulator said the user data was sensitive because use of the app indicated users’ sexual orientation.

Grindr said it doesn’t require users to share that data. The company appealed in February. Mr. Judin said his office is reviewing material submitted by the company as part of its appeal. Spain’s regulator came to a different conclusion in January, and found that data Grindr shared for advertising purposes wasn’t sensitive….(More)”.