Nudging: A Tool to Influence Human Behavior in Health Policy


Book by František Ochrana and Radek Kovács: “Behavioral economics sees ‘nudges’ as ways to encourage people to re-evaluate their priorities in such a way that they voluntarily change their behavior, leading to personal and social benefits. This book examines nudging as a tool for influencing human behavior in health policy. The authors investigate the contemporary scientific discourse on nudging and enrich it with an ontological, epistemological, and praxeological analysis of human behavior. Based on analyses of the literature and a systemic review, the book defines nudging tools within the paradigm of prospect theory. In addition to the theoretical contribution, Nudging also examines and offers suggestions on the practice of health policy regarding obesity, malnutrition, and especially type 2 diabetes mellitus…(More)”.

The Future of Compute


Independent Review by a UK Expert Panel: “…Compute is a material part of modern life. It is among the critical technologies lying behind innovation, economic growth and scientific discoveries. Compute improves our everyday lives. It underpins all the tools, services and information we hold on our handheld devices – from search engines and social media, to streaming services and accurate weather forecasts. This technology may be invisible to the public, but life today would be very different without it.

Sectors across the UK economy, both new and old, are increasingly reliant upon compute. By leveraging the capability that compute provides, businesses of all sizes can extract value from the enormous quantity of data created every day; reduce the cost and time required for research and development (R&D); improve product design; accelerate decision making processes; and increase overall efficiency. Compute also enables advancements in transformative technologies, such as AI, which themselves lead to the creation of value and innovation across the economy. This all translates into higher productivity and profitability for businesses and robust economic growth for the UK as a whole.

Compute powers modelling, simulations, data analysis and scenario planning, and thereby enables researchers to develop new drugs; find new energy sources; discover new materials; mitigate the effects of climate change; and model the spread of pandemics. Compute is required to tackle many of today’s global challenges and brings invaluable benefits to our society.

Compute’s effects on society and the economy have already been and, crucially, will continue to be transformative. The scale of compute capabilities keeps accelerating at pace. The performance of the world’s fastest compute has grown by a factor of 626 since 2010. The compute requirements of the largest machine learning models have grown 10 billion times over the last 10 years. We expect compute demand to grow significantly as compute capability continues to increase. Technology today operates very differently to 10 years ago and, in a decade’s time, it will have changed once again.
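Taken at face value, the growth figures just quoted imply strikingly steep annual rates. A quick back-of-the-envelope sketch of the implied compound annual growth, noting that the 13-year span assumed for the 626× figure is our assumption, since the excerpt does not state its exact endpoint:

```python
def cagr(total_growth: float, years: int) -> float:
    """Compound annual growth rate implied by `total_growth` over `years`."""
    return total_growth ** (1 / years) - 1

# Assumption: the 626x fastest-supercomputer figure spans 2010-2023 (13 years);
# the 10-billion-x ML-model figure spans exactly 10 years, per the excerpt.
supercomputer = cagr(626, 13)              # roughly 0.64, i.e. ~64% per year
ml_models = cagr(10_000_000_000, 10)       # roughly 9.0, i.e. ~10x scale per year

print(f"Fastest supercomputer: ~{supercomputer:.0%} growth per year")
print(f"Largest ML models:     ~{ml_models + 1:.0f}x scale per year")
```

Even under these rough assumptions, the two curves are on very different slopes: the largest models have been scaling roughly an order of magnitude per year, far faster than headline hardware performance.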

Yet, despite compute’s value to the economy and society, the UK lacks a long-term vision for compute…(More)”.

Access to Data for Environmental Purposes: Setting the Scene and Evaluating Recent Changes in EU Data Law


Paper by Michèle Finck and Marie-Sophie Mueller: “Few policy issues will be as defining to the EU’s future as its reaction to environmental decline, on the one hand, and digitalisation, on the other. Whereas the former will shape the (quality of) life and health of humans, animals and plants, the latter will define the future competitiveness of the internal market and, relatedly, also societal justice and cohesion. Yet, to date, the interconnections between these issues are rarely made explicit, as evidenced by the European Commission’s current policy agendas on both matters. With this article, we hope to contribute to what will, ideally, soon be a growing conversation about how to effectively bridge environmental protection and digitalisation. Specifically, we examine how EU law shapes the options of using data—the lifeblood of the digital economy—for environmental sustainability purposes, and ponder the impact of ongoing legislative reform…(More)”.

Suspicion Machines


Lighthouse Reports: “Governments all over the world are experimenting with predictive algorithms in ways that are largely invisible to the public. What limited reporting there has been on this topic has largely focused on predictive policing and risk assessments in criminal justice systems. But there is an area where even more far-reaching experiments are underway on vulnerable populations with almost no scrutiny.

Fraud detection systems are widely deployed in welfare states, ranging from complex machine learning models to crude spreadsheets. The scores they generate have potentially life-changing consequences for millions of people. Until now, public authorities have typically resisted calls for transparency, either by claiming that disclosure would increase the risk of fraud or by citing the need to protect proprietary technology.

The sales pitch for these systems promises that they will recover millions of euros defrauded from the public purse. The caricature of the benefit cheat is a modern take on the classic trope of the undeserving poor, and much of the public debate in Europe, which has the world’s most generous welfare states, is intensely politically charged.

The true extent of welfare fraud is routinely exaggerated by consulting firms, which are often also the algorithm vendors, talking it up to nearly 5 percent of benefits spending, while some national auditors’ offices estimate it at between 0.2 and 0.4 percent of spending. Distinguishing between honest mistakes and deliberate fraud in complex public systems is messy and hard.
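To see how far apart those estimates are in absolute terms, here is a minimal illustration; the EUR 100bn benefits budget below is a hypothetical figure for the sake of the arithmetic, not a number from the article:

```python
# Hypothetical national benefits budget (illustrative only): EUR 100bn.
budget_eur = 100e9

vendor_claim = 0.05 * budget_eur     # "nearly 5 percent", per vendor pitches
auditor_low = 0.002 * budget_eur     # 0.2 percent, lower auditor estimate
auditor_high = 0.004 * budget_eur    # 0.4 percent, upper auditor estimate

print(f"Vendor claim:     EUR {vendor_claim / 1e9:.1f}bn")
print(f"Auditor estimate: EUR {auditor_low / 1e9:.1f}-{auditor_high / 1e9:.1f}bn")
print(f"Overstatement:    {vendor_claim / auditor_high:.1f}x the upper auditor estimate")
```

On these assumptions, the vendor figure overstates even the upper auditor estimate by more than a factor of twelve.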

When opaque technologies are deployed in search of political scapegoats, the potential for harm among some of the poorest and most marginalised communities is significant.

Hundreds of thousands of people are being scored by these systems based on data mining operations where there has been scant public consultation. The consequences of being flagged by the “suspicion machine” can be drastic, with fraud controllers empowered to turn the lives of suspects inside out…(More)”.

The Expanding Use of Technology to Manage Migration


Report by Marti Flacks, Erol Yayboke, Lauren Burke, and Anastasia Strouboulis: “Seeking to manage growing flows of migrants, the United States and European Union have dramatically expanded their engagement with migration origin and transit countries. This increasingly includes supporting the deployment of sophisticated technology to understand, monitor, and influence the movement of people across borders, expanding the spheres of interest to include the movement of people long before they reach U.S. and European borders.

This report from the CSIS Human Rights Initiative and CSIS Project on Fragility and Mobility examines two case studies of migration—one from Central America toward the United States and one from West and North Africa toward Europe—to map the use and export of migration management technologies and the associated human rights risks. Authors Marti Flacks, Erol Yayboke, Lauren Burke, and Anastasia Strouboulis provide recommendations for origin, transit, and destination governments on how to incorporate human rights considerations into their decisionmaking on the use of technology to manage migration…(More)”.

Examining public views on decentralised health data sharing


Paper by Victoria Neumann et al: “In recent years, researchers have begun to explore the use of Distributed Ledger Technologies (DLT), also known as blockchain, in health data sharing contexts. However, there is a significant lack of research that examines public attitudes towards the use of this technology. In this paper, we begin to address this issue and present results from a series of focus groups which explored public views and concerns about engaging with new models of personal health data sharing in the UK. We found that participants were broadly in favour of a shift towards new decentralised models of data sharing. Retaining ‘proof’ of health information stored about patients and the capacity to provide permanent audit trails, enabled by the immutable and transparent properties of DLT, were regarded as particularly valuable by our participants and prospective data custodians. Participants also identified other potential benefits such as supporting people to become more health data literate and enabling patients to make informed decisions about how their data was shared and with whom. However, participants also voiced concerns about the potential to further exacerbate existing health and digital inequalities. Participants were also apprehensive about the removal of intermediaries in the design of personal health informatics systems…(More)”.
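The ‘permanent audit trail’ property the participants valued can be illustrated with a minimal hash chain, a drastically simplified stand-in for a real DLT; the access-log records below are hypothetical, invented purely for the sketch:

```python
import hashlib
import json

def append_entry(chain, record):
    """Append `record`, committing to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    chain.append({"record": record, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; altering any earlier record breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps({"record": entry["record"], "prev": prev_hash},
                             sort_keys=True)
        if hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

# Hypothetical access-log records (illustrative only).
log = []
append_entry(log, "2023-01-05: GP practice read blood-test results")
append_entry(log, "2023-01-09: research study granted read access")
assert verify(log)

log[0]["record"] = "2023-01-05: (silently altered)"
assert not verify(log)  # tampering with history is detectable
```

Because each entry commits to the hash of its predecessor, no earlier record can be changed without invalidating everything after it, which is the sense in which such audit trails are ‘permanent’.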

Toward a 21st Century National Data Infrastructure: Mobilizing Information for the Common Good


Report by National Academies of Sciences, Engineering, and Medicine: “Historically, the U.S. national data infrastructure has relied on the operations of the federal statistical system and the data assets that it holds. Throughout the 20th century, federal statistical agencies aggregated survey responses of households and businesses to produce information about the nation and diverse subpopulations. The statistics created from such surveys provide most of what people know about the well-being of society, including health, education, employment, safety, housing, and food security. The surveys also contribute to an infrastructure for empirical social- and economic-sciences research. Research using survey-response data, with strict privacy protections, has led to important discoveries about the causes and consequences of major societal challenges and has also informed policymakers. Like other infrastructure, people can easily take these essential statistics for granted. Only when they are threatened do people recognize the need to protect them…(More)”.

The Keys to Democracy: Sortition as a New Model for Citizen Power


Book by Maurice Pope: “Sortition — also known as random selection — puts ordinary people in control of decision-making in government. This may seem novel, but it is how the original Athenian democracy worked. In fact, what is new is our belief that electoral systems are democratic. It was self-evident to thinkers from Aristotle to the Renaissance that elections always resulted in oligarchies, or rule by elites.

In this distillation of a lifetime’s thinking about the history and principles of democracy, Maurice Pope presents a new model of governance that replaces elected politicians with assemblies selected by lot. The re-introduction of sortition, he believes, offers a way out of gridlock, apathy, alienation and polarisation by giving citizens back their voice.

Pope’s work — published posthumously — grew from his unique perspective as a widely travelled English classicist who also experienced the injustice of apartheid rule in South Africa. His great mind was as much at home with the history of philosophy as with the mathematics of probability.

Governments and even the EU have tried out sortition in recent years; the UK, France, and several other countries have attempted to tackle climate change through randomly selected citizens’ assemblies. The city of Paris and the German-speaking community of Belgium have set up permanent upper houses chosen by lot. Several hundred such experiments around the world are challenging the assumption that elections are the only or ideal route to credible, effective government.

Writing before these mostly advisory bodies took shape, Pope lays out a vision for a government entirely based on random selection and citizen deliberation. In arguing for this more radical goal, he draws on the glories of ancient Athens, centuries of use in Venice, the success of randomly selected juries and the philosophical advantages of randomness. Sortition-based democracy, he believed, is the only plausible way to achieve each element of Abraham Lincoln’s call for a democratic government “of the people, by the people, for the people”…(More)”.
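In practice, modern citizens’ assemblies rarely draw purely at random: they stratify the lottery so the selected body mirrors the population. A minimal sketch of such a stratified draw, in which the registry, age bands, and quotas are all hypothetical, not taken from the book:

```python
import random

def sortition(pool, strata_key, quotas, seed=None):
    """Select an assembly by lot, stratified so the draw mirrors
    population quotas (e.g. by age band or region)."""
    rng = random.Random(seed)
    assembly = []
    for stratum, seats in quotas.items():
        candidates = [p for p in pool if p[strata_key] == stratum]
        assembly.extend(rng.sample(candidates, seats))  # random within stratum
    return assembly

# Hypothetical registry: 1,000 residents, each tagged with an age band.
bands = random.Random(0).choices(["18-34", "35-54", "55+"], k=1000)
pool = [{"id": i, "age_band": band} for i, band in enumerate(bands)]

# A 30-seat assembly with quotas mirroring assumed population shares.
assembly = sortition(pool, "age_band",
                     {"18-34": 9, "35-54": 11, "55+": 10}, seed=1)
print(len(assembly))  # 30 members, each drawn by lot within their stratum
```

Within each stratum every resident has an equal chance of selection, preserving the democratic equality Pope prizes, while the quotas keep the assembly descriptively representative.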

Foresight is a messy methodology but a marvellous mindset


Blog by Berta Mizsei: “…From my first few forays into foresight, it seemed to employ desk research and expert workshops, but to refrain from using data and from testing the solidity of assumptions. This can make scenarios weak and anecdotal, something experts justify by stating that scenarios are meant to be a ‘first step to start a discussion’.

The deficiencies of foresight became more evident when I took part in the process – so much of what ends up in imagined narratives depends on whether an expert was chatty during a workshop, or on the background of the expert writing the scenario.

As a young researcher coming from a quantitative background, this felt alien and alarming.

However, as it turns out, my issue was not with foresight per se, but rather with a certain way of doing it, one that is insufficiently grounded in sound research methods. In short, I am disturbed by ‘bad’ foresight. Foresight’s newfound popularity means that demand for foresight experts outstrips supply, and thus the prevalence of questionable foresight methodology has increased – something that was discussed during a dedicated session at this year’s Ideas Lab (CEPS’ flagship annual event).

One culprit is the Commission. Its foresight relies heavily on ‘backcasting’, a planning method that starts with a desirable future and works backwards to identify ways to achieve that outcome. One example is the 2022 Strategic Foresight Report ‘Twinning the green and digital transitions in the new geopolitical context’ that mapped out ways to get to the ideal future the Commission cabinet had imagined.

Is this useful? Undoubtedly.

However, it is also single-mindedly deterministic about the future of environmental policy, which is both notoriously complex and of critical importance to the current Commission. Similar hubris (or malpractice) is evident across various EU apparatuses – policymakers have a clear vision of what they want to happen, and they invest in figuring out how to make that a reality without admitting how turbulent and unpredictable the future is. Having a clear destination is commendable and politically advantageous… but it is not foresight.

It misses one of foresight’s main virtues: forcing us to consider alternative futures…(More)”.

Why Does Open Data Get Underused? A Focus on the Role of (Open) Data Literacy


Paper by Gema Santos-Hermosa et al: “Open data has been conceptualised as a strategic form of public knowledge. Tightly connected with developments in open government and open science, the main claim is that access to open data (OD) might be a catalyser of social innovation and citizen empowerment. Nevertheless, the so-called (open) data divide, the gap in OD usage and engagement, remains a concern.

In this chapter, we introduce OD usage trends, focusing on the role played by (open) data literacy amongst both users and producers: citizens, professionals, and researchers. Indeed, we attempt to cover the problem of OD through a holistic approach spanning two areas of research and practice: open government data (OGD) and open research data (ORD). After uncovering several factors blocking OD consumption, we point out that more OD is being published (albeit with low usage), and we overview the research on data literacy. While stakeholders’ intentions are driven by many motivations, the abilities that would enable them to make the most of OD require further attention. In the end, we focus on several lifelong learning activities supporting open data literacy, uncovering the challenges ahead to unleash the power of OD in society…(More)”.