Lessons learned from telco data informing COVID-19 responses: toward an early warning system for future pandemics?


Introduction to a special issue of Data & Policy (Open Access) by Richard Benjamins, Jeanine Vos, and Stefaan Verhulst: “More than a year into the COVID-19 pandemic, the damage is still unfolding. While some countries have recently managed to gain an element of control through aggressive vaccine campaigns, much of the developing world — South and Southeast Asia in particular — remains in a state of crisis. Given what we now know about the global nature of this disease and the potential for mutant versions to develop and spread, a crisis anywhere is cause for concern everywhere. The world remains very much in the grip of this public health crisis.

From the beginning, there has been hope that data and technology could offer solutions to help inform governments’ response strategies and decision-making. Many of these expectations have focused on mobile data analytics in particular, whereby mobile network operators create mobility insights and decision-support tools from anonymized and aggregated telco data. This stems both from a growing group of mobile network operators having invested significantly in systems and capabilities to develop such products and services for public- and private-sector customers, and from the value of these tools having been demonstrated in addressing other global challenges, ranging from models to better understand the spread of Zika in Brazil to interactive dashboards that aided emergency services during earthquakes and floods in Japan. Yet despite these experiences, many governments across the world still have limited awareness, capabilities, and resources to leverage these tools in their efforts to limit the spread of COVID-19 through non-pharmaceutical interventions (NPIs), from both a medical and an economic point of view.

Today, we release the first batch of papers in a special collection of Data & Policy that examines both the potential of mobile data and the challenges faced in delivering these tools to inform government decision-making. Consisting of five papers from 33 researchers and experts from academia, industry, and government, the articles cover a wide range of geographies, including Europe, Argentina, Brazil, Ecuador, France, Gambia, Germany, Ghana, Austria, Belgium, and Spain. Responding to our call for case studies illustrating the opportunities (and challenges) offered by mobile big data in the fight against COVID-19, the authors of these papers describe a number of examples of how mobile and mobile-related data have been used to address the medical, economic, socio-cultural, and political aspects of the pandemic….(More)”.

Virtual Juries


Paper by Valerie P. Hans: “The introduction of virtual or remote jury trials in response to the COVID-19 pandemic constitutes a remarkable natural experiment with one of our nation’s central democratic institutions. Although it is not a tightly controlled experimental study, real world experiences in this natural experiment offer some insights about how key features of trial by jury are affected by a virtual procedure. This article surveys the landscape of virtual jury trials. It examines the issues of jury representativeness, the adequacy of virtual jury selection, the quality of decision making, and the public’s access to jury trial proceedings. Many have expressed concern that the digital divide would negatively affect jury representativeness. Surprisingly, there is some preliminary evidence suggesting that virtual jury selection procedures lead to jury venires that are as diverse, if not more diverse, than pre-pandemic jury venires. Lawyers in a demonstration project reacted favorably to virtual voir dire when it was accompanied by expansive pretrial juror questionnaires and the opportunity to question prospective jurors. A number of courts provided public access by live streaming jury trials. How a virtual jury trial affects jurors’ interpretations of witness testimony, attorney arguments, and jury deliberation remains an open question….(More)”

A Literature Review of E-government Services with Gamification Elements


Paper by Ruth S. Contreras-Espinosa and Alejandro Blanco-M: “Many democracies face breaches of communication between citizens and political representatives, resulting in low engagement in political decision-making and public consultations. Gamification strategies can be implemented to build constructive relationships and increase citizens’ motivation and participation by including positive experiences such as achievements. This paper presents a literature review on gamification, providing a conceptual background and an analysis of a selection of applications to e-government services. The study characterises how gamification elements are used and highlights the need for a standardised methodology for element selection. Three research gaps were identified, with potential impact on future studies and e-government applications….(More)”.

Small-Scale Deliberation and Mass Democracy: A Systematic Review of the Spillover Effects of Deliberative Minipublics


Paper by Ramon van der Does and Vincent Jacquet: “Deliberative minipublics are popular tools to address the current crisis in democracy. However, it remains ambiguous to what degree these small-scale forums matter for mass democracy. In this study, we ask the question to what extent minipublics have “spillover effects” on lay citizens—that is, long-term effects on participating citizens and effects on non-participating citizens. We answer this question by means of a systematic review of the empirical research on minipublics’ spillover effects published before 2019. We identify 60 eligible studies published between 1999 and 2018 and provide a synthesis of the empirical results. We show that the evidence for most spillover effects remains tentative because the relevant body of empirical evidence is still small. Based on the review, we discuss the implications for democratic theory and outline several trajectories for future research…(More)”.

Here Be Dragons – Maintaining Trust in the Technologized Public Sector


Paper by Balázs Bodó and Heleen Janssen: “Emerging technologies, from AI systems and distributed ledgers to private e-commerce and telecommunication platforms, have permeated every aspect of our social, economic, and political relations. Various bodies of the state, from education via law enforcement to healthcare, also increasingly rely on technical components to provide cheap, efficient public services and supposedly fair, transparent, disinterested, accountable public administration. Most of these technical components are provided by private parties, who design, develop, train, and maintain the technical components of public infrastructures.
The rapid, often unplanned, and uncontrolled technologization of public services (as happened, for example, in the rapid adoption of distance learning and teleconferencing systems during the COVID lockdowns) inseparably links the perception of the quality, trustworthiness, and effectiveness of public services, and of the public bodies that provide them, to the successes and failures of their private, technological components: if the government’s welfare fraud AI system fails, it is confidence in government that is ultimately hit.


In this contribution we explore how the use of potentially untrustworthy private technological systems in the public sector may affect trust in government. We argue that citizens’ and businesses’ trust in government is a valuable asset that has come under assault from many directions. The increasing reliance on private technical components in government is in part a response meant to protect this trust, but in many cases it opens up new threats and vulnerabilities, because the trustworthiness of many of these private technical systems is, at best, questionable, particularly when they are deployed in public-sector contexts. We consider a number of policy options to protect trust in government even if some of its technological components are fundamentally untrustworthy….(More)”.

The Returns to Public Library Investment


Working Paper by the Federal Reserve Bank of Chicago: “Local governments spend over 12 billion dollars annually funding the operation of 15,000 public libraries in the United States. This funding supports widespread library use: more than 50% of Americans visit public libraries each year. But despite extensive public investment in libraries, surprisingly little research quantifies the effects of public libraries on communities and children. We use data on the near-universe of U.S. public libraries to study the effects of capital spending shocks on library resources, patron usage, student achievement, and local housing prices. We use a dynamic difference-in-differences approach to show that library capital investment increases children’s attendance at library events by 18%, children’s checkouts of items by 21%, and total library visits by 21%. Increases in library use translate into improved children’s test scores in nearby school districts: a $1,000 or greater per-student capital investment in local public libraries increases reading test scores by 0.02 standard deviations and has no effect on math test scores. Housing prices do not change after a sharp increase in public library capital investment, suggesting that residents internalize the increased cost and improved quality of their public libraries….(More)”.

Unity in Privacy Diversity: A Kaleidoscopic View of Privacy Definitions


Paper by Bert-Jaap Koops and Maša Galič: “Contrary to the common claim that privacy is a concept in disarray, this Article argues that there is considerable coherence in the way privacy has been conceptualized over many decades of privacy scholarship. Seemingly disparate approaches and widely differing definitions actually share close family resemblances that, viewed together from a bird’s-eye perspective, suggest that the concept of privacy is more akin to a kaleidoscope than to a swamp. As a heuristic device to look at this kaleidoscope, we use a working definition of privacy as having spaces in which you can be yourselves, building on two major strands in privacy theory: identity-building and boundary-management. With this heuristic, we analyze how six authoritative privacy accounts can be understood in the terms and rationale of other definitions. We show how the notions of Cohen (room for boundary management), Johnson (freedom from others’ judgement), Nissenbaum (contextual integrity), Reiman (personhood), Warren and Brandeis (being let alone), and Westin (control over information) have significant overlap with—or may even be equivalent to—an understanding of privacy in terms of identity-fostering spaces. Our kaleidoscopic perspective not only highlights the coherence in privacy but also helps explain the function and value of having many different privacy definitions in circulation: each time and context bring their own privacy-related challenges, which might best be addressed through a certain conceptualization of privacy that works in that particular context. As the world turns its kaleidoscope of emerging privacy issues, privacy scholarship turns its kaleidoscope of privacy definitions along. The result of this kaleidoscopic perspective on privacy is an illuminating picture of unity in diversity….(More)”.

Governance mechanisms for sharing of health data: An approach towards selecting attributes for complex discrete choice experiment studies


Paper by Jennifer Viberg Johansson: “The discrete choice experiment (DCE) is a well-established technique for eliciting individual preferences, but it has rarely been used to elicit preferences for the governance of health data sharing.

The aim of this article was to describe the process of identifying attributes for a DCE study aiming to elicit preferences of citizens in Sweden, Iceland and the UK for governance mechanisms for digitally sharing different kinds of health data in different contexts.

A three-step approach was utilised to inform the attribute and level selection: 1) Attribute identification, 2) Attribute development and 3) Attribute refinement. First, we developed an initial set of potential attributes from a literature review and a workshop with experts. To further develop attributes, focus group discussions with citizens (n = 13), ranking exercises among focus group participants (n = 48) and expert interviews (n = 18) were performed. Thereafter, attributes were refined using group discussion (n = 3) with experts as well as cognitive interviews with citizens (n = 11).

The results led to the selection of seven attributes for further development: 1) level of identification, 2) the purpose of data use, 3) type of information, 4) consent, 5) new data user, 6) collector and 7) the oversight of data sharing. Differences were found between countries in the ranking of the top three attributes. The process revealed how participants conceptualised the chosen attributes and informed our attribute development phase.

This study demonstrates a process for selecting attributes for a (multi-country) DCE involving three stages: attribute identification, attribute development and attribute refinement. It can contribute to improving the ethical aspects and good practice of this phase in DCE studies. Specifically, it can inform the development of governance mechanisms in the digital world, where people’s health data are shared for multiple purposes….(More)”.

Retail Analytics: The Quest for Actionable Insights from Big Data on Consumer Behavior and Operational Execution


Paper by Robert P. Rooderkerk, Nicole DeHoratius, and Andres Musalem: “We document the development of academic research on retail analytics and compare it with current practice. We provide a definition of retail analytics, describe its evolution, and conduct bibliometric analyses on 123 retail analytics articles published in top operations journals in the 2000-2020 period. Our work distinguishes nine different analytical decision areas as well as types of analytics used. To document the current state of retail practice, we interviewed global practitioners, asking them to highlight their transitions from basic to advanced analytics. We conclude with a roadmap advising future research on retail analytics, including a discussion of how to better facilitate the adoption of academic work in practice….(More)”.

Politics, Public Goods, and Corporate Nudging in the HTTP/2 Standardization Process


Paper by Sylvia E. Peacock: “The goal of this paper is to map out some policy problems attached to using a club good approach, rather than a public good approach, to manage our internet protocols, specifically HTTP (Hypertext Transfer Protocol). Behavioral and information economics theories are used to evaluate the standardization process of the current-generation HTTP/2. The update under scrutiny is the recently released HTTP/2 version based on Google’s SPDY, which introduces several company-specific and best-practice applications side by side. A content analysis of email discussions extracted from a publicly accessible IETF (Internet Engineering Task Force) email server shows how the working group’s club good approach leads to underperformance in the outcomes of the standardization process. An important conclusion is that in some areas of the IETF, standardization activities may need to include public consultations, crowdsourced volunteers, or an official call for public participation to increase public oversight and manage our intangible public goods more democratically….(More)”.