Octagon Measurement: Public Attitudes toward AI Ethics


Paper by Yuko Ikkatai, Tilman Hartwig, Naohiro Takanashi & Hiromi M. Yokoyama: “Artificial intelligence (AI) is rapidly permeating our lives, but public attitudes toward AI ethics have only partially been investigated quantitatively. In this study, we focused on eight themes commonly shared in AI guidelines: “privacy,” “accountability,” “safety and security,” “transparency and explainability,” “fairness and non-discrimination,” “human control of technology,” “professional responsibility,” and “promotion of human values.” We investigated public attitudes toward AI ethics using four scenarios in Japan. Through an online questionnaire, we found that public disagreement/agreement with using AI varied depending on the scenario. For instance, anxiety over AI ethics was high for the scenario where AI was used with weaponry. Age was significantly related to the themes across the scenarios, but gender and understanding of AI were related differently depending on the themes and scenarios. While the eight themes need to be carefully explained to the participants, our Octagon measurement may be useful for understanding how people feel about the risks of the technologies, especially AI, that are rapidly permeating society and what the problems might be…(More)”.

Privacy Is Power: How Tech Policy Can Bolster Democracy


Essay by Andrew Imbrie, Daniel Baer, Andrew Trask, Anna Puglisi, Erik Brattberg, and Helen Toner: “…History is rarely forgiving, but as we adopt the next phase of digital tools, policymakers can avoid the errors of the past. Privacy-enhancing technologies, or PETs, are a collection of technologies with applications ranging from improved medical diagnostics to secure voting systems and messaging platforms. PETs allow researchers to harness big data to solve problems affecting billions of people while also protecting privacy. …

PETs are ripe for coordination among democratic allies and partners, offering a way for them to jointly develop standards and practical applications that benefit the public good. At an AI summit last July, U.S. Secretary of State Antony Blinken noted the United States’ interest in “increasing access to shared public data sets for AI training and testing, while still preserving privacy,” and National Security Adviser Jake Sullivan pointed to PETs as a promising area “to overcome data privacy challenges while still delivering the value of big data.” Given China’s advantages in scale, the United States and like-minded partners should foster emerging technologies that play to their strengths in medical research and discovery, energy innovation, trade facilitation, and reform around money laundering. Driving innovation and collaboration within and across democracies is important not only because it will help ensure those societies’ success but also because there will be a first-mover advantage in the adoption of PETs for governing the world’s private data–sharing networks.

Accelerating the development of PETs for the public good will require an international approach. Democratic governments will not be the trendsetters on PETs; instead, policymakers for these governments should focus on nurturing the ecosystems these technologies need to flourish. The role for policymakers is not to decide the fate of specific protocols or techniques but rather to foster a conducive environment for researchers to experiment widely and innovate responsibly.    

First, democracies should identify shared priorities and promote basic research to mature the technological foundations of PETs. The underlying technologies require greater investment in algorithmic development and hardware to optimize the chips and mitigate the costs of network overhead. To support the computational requirements for PETs, for example, the National Science Foundation could create an interface through CloudBank and provide cloud compute credits to researchers without access to these resources. The United States could also help incubate an international network of research universities collaborating on these technologies.

Second, science-funding agencies in democracies should host competitions to incentivize new PET protocols and standards—the collaboration between the United States and the United Kingdom announced in early December is a good example. The goal should be to create free, open-source protocols and avoid the fragmentation of the market and the proliferation of proprietary standards. The National Institute of Standards and Technology and other similar bodies should develop standards and measurement tools for PETs; governments and companies should form public-private partnerships to fund open-source protocols over the long term. Open-source protocols are especially important in the early days of PET development, because closed-source PET implementations by profit-seeking actors can be leveraged to build data monopolies. Imagine, for example, a scenario in which all U.S. cancer data is controlled by a single company because every hospital runs its proprietary software, and joining the network requires becoming a customer…(More)”.
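
To make the essay's subject concrete: one canonical PET is differential privacy, which answers aggregate queries over sensitive data while masking any individual's contribution. The sketch below is an illustration, not code from the essay; the patient dataset, the query, and the epsilon value are all assumed for the example.

```python
import math
import random

def dp_count(records, predicate, epsilon=0.5):
    """Return a differentially private count of records matching predicate.

    Adding or removing one record changes the true count by at most 1
    (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Draw Laplace(0, 1/epsilon) noise via inverse-transform sampling.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Illustrative use: report how many patients in a cohort are over 65
# without revealing whether any particular patient is in that group.
patients = [{"age": a} for a in (34, 71, 68, 52, 80, 45)]
print(dp_count(patients, lambda p: p["age"] > 65))
```

Heavier-weight PETs in the same space (secure multiparty computation, federated learning) follow the same principle: extract aggregate value from big data without surrendering the raw records.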

Data trust and data privacy in the COVID-19 period


Paper by Nicholas Biddle et al.: “In this article, we focus on data trust and data privacy, and how attitudes may be changing during the COVID-19 period. On balance, it appears that Australians are more trusting of organizations with regard to data privacy and less concerned about their own personal information and data than they were prior to the spread of COVID-19. The major determinant of this change was a shift in general confidence in government institutions. Despite this improvement in trust with regard to data privacy, trust levels are still low….(More)”.

Surveillance Publishing


Working paper by Jefferson D. Pooley: “…This essay lingers on a prediction too: Clarivate’s business model is coming for scholarly publishing. Google is one peer, but the company’s real competitors are Elsevier, Springer Nature, Wiley, Taylor & Francis, and SAGE. Elsevier, in particular, has been moving into predictive analytics for years now. Of course the publishing giants have long profited from academics and our university employers, packaging scholars’ unpaid writing-and-editing labor only to sell it back to us as usuriously priced subscriptions or APCs. That’s a lucrative business that Elsevier and the others won’t give up. But they’re layering another business on top of their legacy publishing operations, in the Clarivate mold. The data trove that publishers are sitting on is, if anything, far richer than the citation graph alone. Why worry about surveillance publishing? One reason is the balance sheet, since the companies’ trading in academic futures will further pad profits at the expense of taxpayers and students. The bigger reason is that our behavior—once alienated from us and abstracted into predictive metrics—will double back onto our work lives. Existing biases, like male academics’ propensity for self-citation, will receive a fresh coat of algorithmic legitimacy. More broadly, the academic reward system is already distorted by metrics. To the extent that publishers’ tallies and indices get folded into grant-making, tenure-and-promotion, and other evaluative decisions, the metric tide will gain power. The biggest risk is that scholars will internalize an analytics mindset, one already encouraged by citation counts and impact factors….(More)”.
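
As a concrete illustration of the “tallies and indices” the essay warns about (not code from the essay itself), here is the arithmetic behind one ubiquitous metric, the h-index, which collapses a scholar's entire citation record into a single evaluative number; the citation counts are invented for the example.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Six papers' citation counts reduce to one number an evaluator may act on.
print(h_index([25, 12, 8, 5, 3, 1]))  # -> 4
```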

Legal study on Government access to data in third countries


Report commissioned by the European Data Protection Board: “The present report is part of a study analysing the implications for the work of the European Union (EU)/European Economic Area (EEA) data protection supervisory authorities (SAs) in relation to transfers of personal data to third countries after the Court of Justice of the European Union (CJEU) judgment C-311/18 on Data Protection Commissioner v. Facebook Ireland Ltd, Maximilian Schrems (Schrems II). Data controllers and processors may transfer personal data to third countries or international organisations only if the controller or processor has provided appropriate safeguards, and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available.

Whereas it is the primary responsibility of data exporters and data importers to assess whether the legislation of the country of destination enables the data importer to comply with any of the appropriate safeguards, SAs will play a key role when issuing further decisions on transfers to third countries. Hence, this report provides the European Data Protection Board (EDPB) and the SAs in the EEA/EU with information on the legislation and practice in China, India and Russia on their governments’ access to personal data processed by economic operators. The report contains an overview of the relevant information in order for the SAs to assess whether and to what extent legislation and practices in the abovementioned countries imply massive and/or indiscriminate access to personal data processed by economic operators…(More)”.

A data ‘black hole’: Europol ordered to delete vast store of personal data


Article by Apostolis Fotiadis, Ludek Stavinoha, Giacomo Zandonini, Daniel Howden: “…The EU’s police agency, Europol, will be forced by the bloc’s data protection watchdog to delete much of a vast store of personal data that it has been found to have amassed unlawfully. The unprecedented finding from the European Data Protection Supervisor (EDPS) targets what privacy experts are calling a “big data ark” containing billions of points of information. Sensitive data in the ark has been drawn from crime reports, hacked from encrypted phone services and sampled from asylum seekers never involved in any crime.

According to internal documents seen by the Guardian, Europol’s cache contains at least 4 petabytes – equivalent to 3m CD-Roms or a fifth of the entire contents of the US Library of Congress. Data protection advocates say the volume of information held on Europol’s systems amounts to mass surveillance and is a step on its road to becoming a European counterpart to the US National Security Agency (NSA), the organisation whose clandestine online spying was revealed by whistleblower Edward Snowden….(More)”.

The Crowdsourced Panopticon


Book by Jeremy Weissman: “Behind the omnipresent screens of our laptops and smartphones, a digitally networked public has quickly grown larger than the population of any nation on Earth. On the flipside, in front of the ubiquitous recording devices that saturate our lives, individuals are hyper-exposed through a worldwide online broadcast that encourages the public to watch, judge, rate, and rank people’s lives. The interplay of these two forces – the invisibility of the anonymous crowd and the exposure of the individual before that crowd – is a central focus of this book. Informed by critiques of conformity and mass media by some of the greatest philosophers of the past two centuries, as well as by a wide range of historical and empirical studies, Weissman helps shed light on what may happen when our lives are increasingly broadcast online for everyone all the time, to be judged by the global community…(More)”.

Data for Common Purpose: Leveraging Consent to Build Trust


Report by the World Economic Forum: “Over the past few decades, the digital world has been a breeding ground for bad actors, data breaches, and dark patterns of data collection and use. Shifting individuals’ perceptions away from skepticism to a position of trust is no easy task with no easy answers. This report provides a pragmatic approach to strengthen the engagement of individuals and positively affect the experiences of those who contribute data for the common good…(More)”.

Amsterdam introduces mandatory register for sensors


Sarah Wray at Cities Today: “Private companies, research institutions and government organisations in Amsterdam are now obliged to report sensors deployed in public spaces.

The information is being displayed via an online map to give residents more insight into how, where and what data is collected from sources such as cameras, air quality and traffic sensors, Wi-Fi counters and smart billboards. The map shows the type of sensor, the owner and whether personal data is processed.

A statement from the city said: “Amsterdam believes that residents have the right to know where and when data is collected. The sensor register and the reporting obligation help to create awareness. It is one of the 18 actions from the Amsterdam Data Strategy.”

The requirement applies to new sensors and those that are already installed in the city, including mobile sensors.

So far, only sensors from the City of Amsterdam have been included in the register. Other owners are urged to report their sensors and have until 1 June 2022 to do so before enforcement action is taken.

If there is no response even after warnings, the municipality can remove the sensor at the owner’s expense, the city said.

The obligation to report sensors is part of a regulation update recently passed by the City Council…(More)”.
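
The article does not describe the register's data model, but the details it lists (sensor type, owner, position on the public map, whether personal data is processed, and coverage of mobile sensors) suggest a record along the following lines. This schema is a hypothetical sketch, not Amsterdam's actual format, and the example values are invented.

```python
from dataclasses import dataclass

@dataclass
class SensorRegistration:
    """Hypothetical register entry; fields are assumed from the article."""
    sensor_type: str               # e.g. "camera", "air quality", "Wi-Fi counter"
    owner: str                     # organisation responsible for the sensor
    latitude: float                # position displayed on the public map
    longitude: float
    processes_personal_data: bool  # shown to residents alongside type and owner
    mobile: bool = False           # the reporting duty also covers mobile sensors

entry = SensorRegistration(
    sensor_type="Wi-Fi counter",
    owner="City of Amsterdam",
    latitude=52.3728,              # illustrative coordinates
    longitude=4.8936,
    processes_personal_data=True,
)
print(entry)
```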

Pandemic Privacy


A Preliminary Analysis of Collection Technologies, Data Collection Laws, and Legislative Reform during COVID-19 by Benjamin Ballard, Amanda Cutinha, and Christopher Parsons: “…a preliminary comparative analysis of how different information technologies were mobilized in response to COVID-19 to collect data, the extent to which Canadian health, privacy, or emergency laws impeded the response to COVID-19, and ultimately, the potential consequences of reforming data protection or privacy laws to enable more expansive data collection, use, or disclosure of personal information in future health emergencies. In analyzing how data was collected in the United States, the United Kingdom, and Canada, we found that while many of the data collection methods could be mapped onto a trajectory of past collection practices, the breadth and extent of data collection, in tandem with how communications networks were repurposed, constituted novel technological responses to a health crisis. Similarly, while the intersection of public and private interests in providing healthcare and government services is not new, the ability of private companies such as Google and Apple to forcefully shape some of the technology-enabled pandemic responses speaks to the significant ability of private companies to guide or direct public health measures that rely on contemporary smartphone technologies. While we found that the uses of technologies were linked to historical efforts to combat the spread of disease, the nature and extent of private surveillance to enable public action was arguably unprecedented….(More)”.
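
The influence of Google and Apple that the report describes ran largely through their joint exposure-notification framework for contact tracing. The sketch below conveys the general shape of that design in simplified form: short-lived Bluetooth identifiers derived one-way from a daily key, so bystanders cannot link broadcasts back to a person. It is an assumption-laden illustration, not the actual GAEN key schedule (which uses HKDF and AES rather than HMAC).

```python
import hashlib
import hmac
import os

def daily_key() -> bytes:
    """Random per-day key that stays on the phone unless the user tests positive."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier for one time interval.

    Identifiers rotate (e.g. every 10 minutes) and are derived one-way
    from the daily key, so observed broadcasts cannot be linked back to
    a person unless the key is later published for exposure matching.
    """
    mac = hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.digest()[:16]

key = daily_key()
ids = [rolling_id(key, i) for i in range(144)]  # 144 ten-minute intervals per day
print(ids[0].hex(), ids[1].hex())
```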