Conflicts over access to Americans’ personal data emerging across federal government


Article by Caitlin Andrews: “The Trump administration’s fast-moving efforts to limit the size of the U.S. federal bureaucracy, primarily through the recently minted Department of Government Efficiency, are raising privacy and data security concerns among current and former officials across the government, particularly as the administration scales back positions charged with privacy oversight. Efforts to limit the independence of a host of federal agencies through a new executive order — including the independence of the Federal Trade Commission and Securities and Exchange Commission — are also ringing alarm bells among civil society and some legal experts.

According to CNN, several staff within the Office of Personnel Management’s privacy and records-keeping department were fired last week. Staff who handle communications and respond to Freedom of Information Act requests were also let go. Though the entire privacy team was not fired, according to the OPM, details about what kind of oversight will remain within the department were limited. The report also states the staff’s termination date is 15 April.

It is one of several moves the Trump administration has made in recent days reshaping how entities access government agencies’ information and how that access is overseen.

The New York Times reports on a wide range of incidents within the government where DOGE’s efforts to limit fraudulent government spending by accessing sensitive agency databases have run up against staffers who are concerned about the privacy of Americans’ personal information. In one incident, Social Security Administration acting Commissioner Michelle King was fired after resisting a request from DOGE to access the agency’s database. “The episode at the Social Security Administration … has played out repeatedly across the federal government,” the Times reported…(More)”.

On Privacy and Technology


Book by Daniel J. Solove: “With the rapid rise of new digital technologies and artificial intelligence, is privacy dead? Can anything be done to save us from a dystopian world without privacy?

In this short and accessible book, internationally renowned privacy expert Daniel J. Solove draws from a range of fields, from law to philosophy to the humanities, to illustrate the profound changes technology is wreaking upon our privacy, why they matter, and what can be done about them. Solove provides incisive examinations of key concepts in the digital sphere, including control, manipulation, harm, automation, reputation, consent, prediction, inference, and many others.

Compelling and passionate, On Privacy and Technology teems with powerful insights that will transform the way you think about privacy and technology…(More)”.

Empowering open data sharing for social good: a privacy-aware approach


Paper by Tânia Carvalho et al: “The Covid-19 pandemic has affected the world at multiple levels. Data sharing was pivotal for advancing research to understand the underlying causes and implement effective containment strategies. In response, many countries have facilitated access to daily cases to support research initiatives, fostering collaboration between organisations and making such data available to the public through open data platforms. Despite the several advantages of data sharing, one of the major concerns before releasing health data is its impact on individuals’ privacy. Such a sharing process should adhere to state-of-the-art methods in Data Protection by Design and by Default. In this paper, we use a Covid-19 data set from Portugal’s second-largest hospital to show how it is feasible to ensure data privacy while improving the quality and maintaining the utility of the data. Our goal is to demonstrate how knowledge exchange in multidisciplinary teams of healthcare practitioners, data privacy, and data science experts is crucial to co-developing strategies that ensure high utility in de-identified data…(More).”
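A core technique in the kind of de-identification the paper describes is generalizing quasi-identifiers (age, location) until every record is indistinguishable from at least k−1 others, then measuring the resulting k-anonymity. A minimal sketch, where the field names, generalization rules, and sample records are illustrative assumptions rather than the paper’s actual pipeline:

```python
from collections import Counter

def generalize(record):
    """Coarsen quasi-identifiers: age to a 10-year band, postcode to a 2-digit prefix."""
    return (record["age"] // 10 * 10, record["postcode"][:2])

def k_anonymity(records):
    """Smallest equivalence-class size over the generalized quasi-identifiers."""
    counts = Counter(generalize(r) for r in records)
    return min(counts.values())

records = [
    {"age": 34, "postcode": "4050", "test_result": "positive"},
    {"age": 37, "postcode": "4075", "test_result": "negative"},
    {"age": 52, "postcode": "4400", "test_result": "positive"},
    {"age": 58, "postcode": "4435", "test_result": "negative"},
]
print(k_anonymity(records))  # each generalized class holds 2 records -> k = 2
```

The privacy–utility trade-off the authors highlight shows up directly here: coarser bands raise k (stronger privacy) but discard detail that downstream analyses may need.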

Big brother: the effects of surveillance on fundamental aspects of social vision


Paper by Kiley Seymour et al: “Despite the dramatic rise of surveillance in our societies, only limited research has examined its effects on humans. While most research has focused on voluntary behaviour, no study has examined the effects of surveillance on more fundamental and automatic aspects of human perceptual awareness and cognition. Here, we show that being watched on CCTV markedly impacts a hardwired and involuntary function of human sensory perception—the ability to consciously detect faces. Using the method of continuous flash suppression (CFS), we show that when people are surveilled (N = 24), they are quicker than controls (N = 30) to detect faces. An independent control experiment (N = 42) ruled out an explanation based on demand characteristics and social desirability biases. These findings show that being watched impacts not only consciously controlled behaviours but also unconscious, involuntary visual processing. Our results have implications concerning the impacts of surveillance on basic human cognition as well as public mental health…(More)”.

Protecting civilians in a data-driven and digitalized battlespace: Towards a minimum basic technology infrastructure


Paper by Ann Fitz-Gerald and Jenn Hennebry: “This article examines the realities of modern day warfare, including a rising trend in hybrid threats and irregular warfare which employ emerging technologies supported by digital and data-driven processes. The way in which these technologies become applied generates a widened battlefield and leads to a greater number of civilians being caught up in conflict. Humanitarian groups mandated to protect civilians have adapted their approaches to the use of new emerging technologies. However, the lack of international consensus on the use of data, the public and private nature of the actors involved in conflict, the transnational aspects of the widened battlefield, and the heightened security risks in the conflict space pose enormous challenges for the protection of civilians agenda. Given the dual-use nature of emerging technologies, the challenges associated with regulation and the need for those affected by conflict to demonstrate resilience towards, and knowledge of, digital media literacy, this paper proposes the development of guidance for a “minimum basic technology infrastructure” which is supported by technology, regulation, and public awareness and education…(More)”.

Congress should designate an entity to oversee data security, GAO says


Article by Matt Bracken: “Federal agencies may need to rethink how they handle individuals’ personal data to protect their civil rights and civil liberties, a congressional watchdog said in a new report Tuesday.

Without federal guidance governing the protection of the public’s civil rights and liberties, agencies have pursued a patchwork system of policies tied to the collection, sharing and use of data, the Government Accountability Office said.

To address that problem head-on, the GAO is recommending that Congress select “an appropriate federal entity” to produce guidance or regulations regarding data protection that would apply to all agencies, giving that entity “the explicit authority to make needed technical and policy choices or explicitly stating Congress’s own choices.”

That recommendation was formed after the GAO sent a questionnaire to all 24 Chief Financial Officers Act agencies asking for information about their use of emerging technologies and data capabilities and how they’re guaranteeing that personally identifiable information is safeguarded.

The GAO found that 16 of those CFO Act agencies have policies or procedures in place to protect civil rights and civil liberties with regard to data use, while the other eight have not taken steps to do the same.

The most commonly cited issues for agencies in their efforts to protect the civil rights and civil liberties of the public were “complexities in handling protections associated with new and emerging technologies” and “a lack of qualified staff possessing needed skills in civil rights, civil liberties, and emerging technologies.”

“Further, eight of the 24 agencies believed that additional government-wide law or guidance would strengthen consistency in addressing civil rights and civil liberties protections,” the GAO wrote. “One agency noted that such guidance could eliminate the hodge-podge approach to the governance of data and technology.”

All 24 CFO Act agencies have internal offices to “handle the protection of the public’s civil rights as identified in federal laws,” with much of that work centered on the handling of civil rights violations and related complaints. Four agencies — the departments of Defense, Homeland Security, Justice and Education — have offices to specifically manage civil liberty protections across their entire agencies. The other 20 agencies have mostly adopted a “decentralized approach to protecting civil liberties, including when collecting, sharing, and using data,” the GAO noted…(More)”.

Privacy during pandemics: Attitudes to public use of personal data


Paper by Eleonora Freddi and Ole Christian Wasenden: “In this paper we investigate people’s attitudes to privacy and sharing of personal data when used to help society combat a contagious disease, such as COVID-19. Through a two-wave survey, we investigate the role of personal characteristics, and the effect of information, in shaping privacy attitudes. By conducting the survey in Norway and Sweden, which adopted very different strategies to handle the COVID-19 pandemic, we analyze potential differences in privacy attitudes due to policy changes. We find that privacy concern is negatively correlated with allowing public use of personal data. Trust in the entity collecting data and collectivist preferences are positively correlated with this type of data usage. Providing more information about the public benefit of sharing personal data makes respondents more positive to the use of their data, while providing additional information about the costs associated with data sharing does not change attitudes. The analysis suggests that stating a clear purpose and benefit for the data collection makes respondents more positive about sharing. Despite very different policy approaches, we do not find any major differences in privacy attitudes between Norway and Sweden. Findings are also similar between the two survey waves, suggesting a minor role for contextual changes…(More)”

What’s the Value of Privacy?


Brief by New America: “On a day-to-day basis, people make decisions about what information to share and what information to keep to themselves—guided by an inner privacy compass. Privacy is a concept that is both evocative and broad, often possessing different meanings for different people. The term eludes a common, static definition, though it is now inextricably linked to technology and a growing sense that individuals do not have control over their personal information. If privacy still, at its core, encompasses “the right to be left alone,” then that right is increasingly difficult to exercise in the modern era.

The inability to meaningfully choose privacy is not an accident—in fact, it’s often by design. Society runs on data. Whether it is data about people’s personal attributes, preferences, or actions, all that data can be linked together, becoming greater than the sum of its parts. If data is now the world’s most valuable resource, then the companies that are making record profits off that data are highly incentivized to keep accessing it and obfuscating the externalities of data sharing. In brief, data use and privacy are “economically significant.”

And yet, despite the pervasive nature of data collection, much of the public lacks a nuanced understanding of the true costs and benefits of sharing their data—for themselves and for society as a whole. People who have made billions by collecting and re-selling individual user data will continue to claim that it has little value. And yet, there are legitimate reasons why data should be shared—without a clear understanding of an issue, it is impossible to address it…(More)”.

Someone Put Facial Recognition Tech onto Meta’s Smart Glasses to Instantly Dox Strangers


Article by Joseph Cox: “A pair of students at Harvard have built what big tech companies refused to release publicly due to the overwhelming risks and danger involved: smart glasses with facial recognition technology that automatically looks up someone’s face and identifies them. The students have gone a step further too. Their customized glasses also pull other information about their subject from around the web, including their home address, phone number, and family members. 

The project is designed to raise awareness of what is possible with this technology, and the pair are not releasing their code, AnhPhu Nguyen, one of the creators, told 404 Media. But the experiment, tested in some cases on unsuspecting people in the real world according to a demo video, still shows the razor-thin line between a world in which people can move around with relative anonymity and one where your identity and personal information can be pulled up in an instant by strangers.

Nguyen and co-creator Caine Ardayfio call the project I-XRAY. It uses a pair of Meta’s commercially available Ray-Ban smart glasses, and allows a user to “just go from face to name,” Nguyen said…(More)”.

Data Privacy for Record Linkage and Beyond


Paper by Shurong Lin & Eric Kolaczyk: “In a data-driven world, two prominent research problems are record linkage and data privacy, among others. Record linkage is essential for improving decision-making by integrating information of the same entities from different sources. On the other hand, data privacy research seeks to balance the need to extract accurate insights from data with the imperative to protect the privacy of the entities involved. Inevitably, data privacy issues arise in the context of record linkage. This article identifies two complementary aspects at the intersection of these two fields: (1) how to ensure privacy during record linkage and (2) how to mitigate privacy risks when releasing the analysis results after record linkage. We specifically discuss privacy-preserving record linkage, differentially private regression, and related topics…(More)”.
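The second aspect the authors identify, mitigating privacy risk when releasing post-linkage analysis results, is commonly handled with differential privacy. A minimal sketch of its standard building block, the Laplace mechanism, applied to releasing a count; the function names and the example count are illustrative assumptions, not taken from the paper:

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-differential privacy.

    One record entering or leaving the linked data changes the count by at
    most `sensitivity`, so Laplace noise with scale sensitivity/epsilon
    masks any individual's presence.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)
print(dp_count(1342, epsilon=0.5))  # noisy count; exact value depends on the draw
```

Smaller epsilon means more noise and stronger privacy; the same mechanism, with noise calibrated to gradient or coefficient sensitivity, underlies the differentially private regression the paper discusses.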