The UN is testing technology that processes data confidentially


The Economist: “Reasons of confidentiality mean that many medical, financial, educational and other personal records, from the analysis of which much public good could be derived, are in practice unavailable. A lot of commercial data are similarly sequestered. For example, firms have more granular and timely information on the economy than governments can obtain from surveys. But such intelligence would be useful to rivals. If companies could be certain it would remain secret, they might be more willing to make it available to officialdom.

A range of novel data-processing techniques might make such sharing possible. These so-called privacy-enhancing technologies (PETs) are still in the early stages of development. But they are about to get a boost from a project launched by the United Nations’ statistics division. The UN PETs Lab, which opened for business officially on January 25th, enables national statistics offices, academic researchers and companies to collaborate to carry out projects which will test various PETs, permitting technical and administrative hiccups to be identified and overcome.

The first such effort, which actually began last summer, before the PETs Lab’s formal inauguration, analysed import and export data from national statistical offices in America, Britain, Canada, Italy and the Netherlands, to look for anomalies. Those could be a result of fraud, of faulty record keeping or of innocuous re-exporting.

For the pilot scheme, the researchers used categories already in the public domain—in this case international trade in things such as wood pulp and clocks. They thus hoped to show that the system would work, before applying it to information where confidentiality matters.
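The excerpt does not spell out the analysis itself, but a common way to hunt for such anomalies in trade data is to compare "mirror statistics": country A's reported exports to B against B's reported imports from A, which should roughly agree. Below is a minimal sketch of that idea in Python; the record layout, tolerance threshold, and figures are illustrative assumptions, not the pilot's actual method.

```python
# Minimal sketch of a "mirror statistics" check on public trade data.
# Assumption: each office reports rows of (reporter, partner, commodity,
# direction, value); A's reported exports to B should roughly match
# B's reported imports from A, so large gaps are worth a closer look.
from typing import NamedTuple

class TradeFlow(NamedTuple):
    reporter: str    # country filing the record
    partner: str     # counterparty country
    commodity: str   # e.g. "wood pulp" or "clocks"
    direction: str   # "export" or "import"
    value_usd: float

def mirror_anomalies(flows: list[TradeFlow], tolerance: float = 0.25):
    """Flag flows where exporter- and importer-reported values diverge
    by more than `tolerance` as a fraction of the larger figure."""
    exports, imports = {}, {}
    for f in flows:
        # Normalise both record types to an (exporter, importer, commodity) key.
        if f.direction == "export":
            exports[(f.reporter, f.partner, f.commodity)] = f.value_usd
        else:
            imports[(f.partner, f.reporter, f.commodity)] = f.value_usd
    anomalies = []
    for key in exports.keys() & imports.keys():
        larger = max(exports[key], imports[key])
        if larger and abs(exports[key] - imports[key]) / larger > tolerance:
            anomalies.append((key, exports[key], imports[key]))
    return anomalies

flows = [
    TradeFlow("US", "UK", "wood pulp", "export", 120.0),
    TradeFlow("UK", "US", "wood pulp", "import", 45.0),  # big gap: flagged
    TradeFlow("US", "NL", "clocks", "export", 80.0),
    TradeFlow("NL", "US", "clocks", "import", 78.0),     # close: ignored
]
print(mirror_anomalies(flows))
```

A flagged gap does not identify the cause; as the excerpt notes, it could be fraud, faulty record keeping, or innocuous re-exporting.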

They put several kinds of PETs through their paces. In one trial, OpenMined, a charity based in Oxford, tested a technique called secure multiparty computation (SMPC). This approach involves the data to be analysed being encrypted by their keeper and staying on the premises. The organisation running the analysis (in this case OpenMined) sends its algorithm to the keeper, who runs it on the encrypted data. That is mathematically complex, but possible. The findings are then sent back to the original inquirer…(More)”.
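The excerpt keeps the cryptography at arm's length. As a rough intuition for how parties can compute a joint result without revealing their inputs, here is a minimal sketch of additive secret sharing, one standard building block of SMPC. This is an illustration under stated assumptions, not OpenMined's actual protocol (their production stack is considerably more involved).

```python
# Minimal sketch of additive secret sharing, a building block behind SMPC.
# Assumption for illustration only: two data keepers each hold a private
# total and want only the combined total revealed, never the raw inputs.
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n random-looking shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % PRIME

# Each keeper splits its secret and hands one share to each party.
keeper_a_shares = share(1_200, 2)   # keeper A's confidential total
keeper_b_shares = share(3_450, 2)   # keeper B's confidential total

# Each party adds the shares it holds locally; no party ever sees a raw input,
# because each individual share is indistinguishable from a random number.
party_sums = [(keeper_a_shares[i] + keeper_b_shares[i]) % PRIME for i in range(2)]

# Only the recombined aggregate is revealed to the analyst.
print(reconstruct(party_sums))  # 4650
```

Addition is the simplest case; real SMPC protocols extend this trick to multiplication and comparisons, which is where the mathematical complexity the excerpt alludes to comes in.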

Why Privacy Matters


Book by Neil Richards: “Many people tell us that privacy is dead, or that it is dying, but such talk is a dangerous fallacy. This book explains what privacy is, what privacy isn’t, and why privacy matters. Privacy is the extent to which human information is known or used, and it is fundamentally about the social power that human information provides over other people. The best way to ensure that power is checked and channeled in ways that benefit humans and their society is through rules—rules about human information. And because human information rules of some sort are inevitable, we should craft our privacy rules to promote human values. The book suggests three such values that our human information rules should promote: identity, freedom, and protection. Identity allows us to be thinking, self-defining humans; freedom lets us be citizens; while protection safeguards our roles as situated consumers and workers, allowing us, as members of society, to trust and rely on other people so that we can live our lives and hopefully build a better future together…(More)”.

Octagon Measurement: Public Attitudes toward AI Ethics


Paper by Yuko Ikkatai, Tilman Hartwig, Naohiro Takanashi & Hiromi M. Yokoyama: “Artificial intelligence (AI) is rapidly permeating our lives, but public attitudes toward AI ethics have only partially been investigated quantitatively. In this study, we focused on eight themes commonly shared in AI guidelines: “privacy,” “accountability,” “safety and security,” “transparency and explainability,” “fairness and non-discrimination,” “human control of technology,” “professional responsibility,” and “promotion of human values.” We investigated public attitudes toward AI ethics using four scenarios in Japan. Through an online questionnaire, we found that public disagreement/agreement with using AI varied depending on the scenario. For instance, anxiety over AI ethics was high for the scenario where AI was used with weaponry. Age was significantly related to the themes across the scenarios, but gender and understanding of AI were related differently depending on the themes and scenarios. While the eight themes need to be carefully explained to the participants, our Octagon measurement may be useful for understanding how people feel about the risks of the technologies, especially AI, that are rapidly permeating society and what the problems might be…(More)”.
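For readers wondering what the “Octagon” looks like in practice: eight theme scores lend themselves to a radar (octagon) plot, one axis per theme. A minimal sketch follows; the theme labels come from the excerpt, while the scores and styling are hypothetical.

```python
# Minimal sketch of an "octagon" (radar) plot over the eight AI-ethics themes.
# The theme labels come from the paper's list; the scores are hypothetical.
import math
import matplotlib.pyplot as plt

themes = ["privacy", "accountability", "safety and security",
          "transparency and explainability", "fairness and non-discrimination",
          "human control of technology", "professional responsibility",
          "promotion of human values"]
scores = [0.6, 0.4, 0.8, 0.5, 0.7, 0.9, 0.3, 0.5]  # hypothetical survey scores

# One axis per theme, evenly spaced around the circle.
angles = [2 * math.pi * i / len(themes) for i in range(len(themes))]
angles += angles[:1]   # repeat the first point to close the polygon
scores += scores[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, scores)
ax.fill(angles, scores, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(themes, fontsize=7)
plt.show()
```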

Privacy Is Power: How Tech Policy Can Bolster Democracy


Essay by Andrew Imbrie, Daniel Baer, Andrew Trask, Anna Puglisi, Erik Brattberg, and Helen Toner: “…History is rarely forgiving, but as we adopt the next phase of digital tools, policymakers can avoid the errors of the past. Privacy-enhancing technologies, or PETs, are a collection of technologies with applications ranging from improved medical diagnostics to secure voting systems and messaging platforms. PETs allow researchers to harness big data to solve problems affecting billions of people while also protecting privacy. …

PETs are ripe for coordination among democratic allies and partners, offering a way for them to jointly develop standards and practical applications that benefit the public good. At an AI summit last July, U.S. Secretary of State Antony Blinken noted the United States’ interest in “increasing access to shared public data sets for AI training and testing, while still preserving privacy,” and National Security Adviser Jake Sullivan pointed to PETs as a promising area “to overcome data privacy challenges while still delivering the value of big data.” Given China’s advantages in scale, the United States and like-minded partners should foster emerging technologies that play to their strengths in medical research and discovery, energy innovation, trade facilitation, and reform around money laundering. Driving innovation and collaboration within and across democracies is important not only because it will help ensure those societies’ success but also because there will be a first-mover advantage in the adoption of PETs for governing the world’s private data–sharing networks.

Accelerating the development of PETs for the public good will require an international approach. Democratic governments will not be the trendsetters on PETs; instead, policymakers for these governments should focus on nurturing the ecosystems these technologies need to flourish. The role for policymakers is not to decide the fate of specific protocols or techniques but rather to foster a conducive environment for researchers to experiment widely and innovate responsibly.    

Democracies should identify shared priorities and promote basic research to mature the technological foundations of PETs. The underlying technologies require greater investment in algorithmic development and hardware to optimize the chips and mitigate the costs of network overhead. To support the computational requirements for PETs, for example, the National Science Foundation could create an interface through CloudBank and provide cloud compute credits to researchers without access to these resources. The United States could also help incubate an international network of research universities collaborating on these technologies.

Second, science-funding agencies in democracies should host competitions to incentivize new PETs protocols and standards—the collaboration between the United States and the United Kingdom announced in early December is a good example. The goal should be to create free, open-source protocols and avoid the fragmentation of the market and the proliferation of proprietary standards. The National Institute of Standards and Technology and other similar bodies should develop standards and measurement tools for PETs; governments and companies should form public-private partnerships to fund open-source protocols over the long term. Open-source protocols are especially important in the early days of PET development, because closed-source PET implementations by profit-seeking actors can be leveraged to build data monopolies. For example, imagine a scenario where all U.S. cancer data could be controlled by a single company because all the hospitals are running its proprietary software, and you have to become a customer to join the network…(More)”.

Data trust and data privacy in the COVID-19 period


Paper by Nicholas Biddle et al: “In this article, we focus on data trust and data privacy, and how attitudes may be changing during the COVID-19 period. On balance, it appears that Australians are more trusting of organizations with regards to data privacy and less concerned about their own personal information and data than they were prior to the spread of COVID-19. The major determinant of this change in trust with regards to data was changes in general confidence in government institutions. Despite this improvement in trust with regards to data privacy, trust levels are still low….(More)”.

Surveillance Publishing


Working paper by Jefferson D. Pooley: “…This essay lingers on a prediction too: Clarivate’s business model is coming for scholarly publishing. Google is one peer, but the company’s real competitors are Elsevier, Springer Nature, Wiley, Taylor & Francis, and SAGE. Elsevier, in particular, has been moving into predictive analytics for years now. Of course the publishing giants have long profited off of academics and our university employers—by packaging scholars’ unpaid writing-and-editing labor only to sell it back to us as usuriously priced subscriptions or APCs. That’s a lucrative business that Elsevier and the others won’t give up. But they’re layering another business on top of their legacy publishing operations, in the Clarivate mold. The data trove that publishers are sitting on is, if anything, far richer than the citation graph alone. Why worry about surveillance publishing? One reason is the balance-sheet, since the companies’ trading in academic futures will further pad profits at the expense of taxpayers and students. The bigger reason is that our behavior—once alienated from us and abstracted into predictive metrics—will double back onto our work lives. Existing biases, like male academics’ propensity for self-citation, will receive a fresh coat of algorithmic legitimacy. More broadly, the academic reward system is already distorted by metrics. To the extent that publishers’ tallies and indices get folded into grant-making, tenure-and-promotion, and other evaluative decisions, the metric tide will gain power. The biggest risk is that scholars will internalize an analytics mindset, one already encouraged by citation counts and impact factors….(More)”.
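As a toy illustration of how mechanically such metrics fall out of a citation graph (and how little nuance they carry), a self-citation share reduces to a few lines over paper records. The data structure and figures below are invented for illustration; metrics used in practice are more elaborate.

```python
# Toy sketch: an author's self-citation share computed from a citation graph.
# Invented data for illustration; real bibliometric indicators are field-
# weighted, windowed, and far more elaborate.
papers = {
    "p1": {"authors": {"alice"}, "cites": []},
    "p2": {"authors": {"alice", "bob"}, "cites": ["p1"]},   # self-cite for alice
    "p3": {"authors": {"carol"}, "cites": ["p1", "p2"]},
    "p4": {"authors": {"alice"}, "cites": ["p2", "p3"]},    # self-cite for alice
}

def self_citation_share(author: str) -> float:
    """Fraction of citations to `author`'s papers made by the author."""
    received = self_cites = 0
    for paper in papers.values():
        for target in paper["cites"]:
            if author in papers[target]["authors"]:
                received += 1
                self_cites += author in paper["authors"]
    return self_cites / received if received else 0.0

print(f"{self_citation_share('alice'):.2f}")  # 2 of 4 citations -> 0.50
```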

Legal study on Government access to data in third countries


Report commissioned by the European Data Protection Board: “The present report is part of a study analysing the implications for the work of the European Union (EU)/European Economic Area (EEA) data protection supervisory authorities (SAs) in relation to transfers of personal data to third countries after the Court of Justice of the European Union (CJEU) judgment C-311/18 on Data Protection Commissioner v. Facebook Ireland Ltd, Maximilian Schrems (Schrems II). Data controllers and processors may transfer personal data to third countries or international organisations only if the controller or processor has provided appropriate safeguards, and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available.

Whereas it is the primary responsibility of data exporters and data importers to assess that the legislation of the country of destination enables the data importer to comply with any of the appropriate safeguards, SAs will play a key role when issuing further decisions on transfers to third countries. Hence, this report provides the European Data Protection Board (EDPB) and the SAs in the EEA/EU with information on the legislation and practice in China, India and Russia on their governments’ access to personal data processed by economic operators. The report contains an overview of the relevant information in order for the SAs to assess whether and to what extent legislation and practices in the abovementioned countries imply massive and/or indiscriminate access to personal data processed by economic operators…(More)”.

A data ‘black hole’: Europol ordered to delete vast store of personal data


Article by Apostolis Fotiadis, Ludek Stavinoha, Giacomo Zandonini, Daniel Howden: “…The EU’s police agency, Europol, will be forced by the bloc’s data protection watchdog to delete much of a vast store of personal data that it was found to have amassed unlawfully. The unprecedented finding from the European Data Protection Supervisor (EDPS) targets what privacy experts are calling a “big data ark” containing billions of points of information. Sensitive data in the ark has been drawn from crime reports, hacked from encrypted phone services and sampled from asylum seekers never involved in any crime.

According to internal documents seen by the Guardian, Europol’s cache contains at least 4 petabytes – equivalent to 3m CD-Roms or a fifth of the entire contents of the US Library of Congress. Data protection advocates say the volume of information held on Europol’s systems amounts to mass surveillance and is a step on its road to becoming a European counterpart to the US National Security Agency (NSA), the organisation whose clandestine online spying was revealed by whistleblower Edward Snowden….(More)”.

The Crowdsourced Panopticon


Book by Jeremy Weissman: “Behind the omnipresent screens of our laptops and smartphones, a digitally networked public has quickly grown larger than the population of any nation on Earth. On the flipside, in front of the ubiquitous recording devices that saturate our lives, individuals are hyper-exposed through a worldwide online broadcast that encourages the public to watch, judge, rate, and rank people’s lives. The interplay of these two forces – the invisibility of the anonymous crowd and the exposure of the individual before that crowd – is a central focus of this book. Informed by critiques of conformity and mass media by some of the greatest philosophers of the past two centuries, as well as by a wide range of historical and empirical studies, Weissman helps shed light on what may happen when our lives are increasingly broadcast online for everyone all the time, to be judged by the global community…(More)”.

Data for Common Purpose: Leveraging Consent to Build Trust


Report by the World Economic Forum: “Over the past few decades, the digital world has been a breeding ground for bad actors, data breaches, and dark patterns of data collection and use. Shifting individuals’ perceptions away from skepticism to a position of trust is no easy task with no easy answers. This report provides a pragmatic approach to strengthen the engagement of individuals and positively affect the experiences of those who contribute data for the common good…(More)”.