Paper by Amanda Machin: “Stymied by preoccupation with short-term interests of individualist consumers, democratic institutions seem unable to generate sustained political commitment for tackling climate change. The citizens’ assembly (CA) is promoted as an important tool in combatting this “democratic myopia.” The aim of a CA is to bring together a representative group of citizens and experts from diverse backgrounds to exchange their different insights and perspectives on a complex issue. By providing the opportunity for inclusive democratic deliberation, the CA is expected to educate citizens, stimulate awareness of complex issues, and produce enlightened and legitimate policy recommendations. However, critical voices warn about the simplified and celebratory commentary surrounding the CA. Informed by agonistic and radical democratic theory, this paper elaborates on a particular concern, which is the orientation toward consensus in the CA. The paper points to the importance of disagreement in the form of both agony (from inside) and rupture (from outside) that, it is argued, is crucial for a democratic, engaging, passionate, creative, and representative sustainability politics…(More)”.
Democracy Report 2023: Defiance in the Face of Autocratization
New report by Varieties of Democracy (V-Dem): “.. the largest global dataset on democracy with over 31 million data points for 202 countries from 1789 to 2022. Involving almost 4,000 scholars and other country experts, V-Dem measures hundreds of different attributes of democracy. V-Dem enables new ways to study the nature, causes, and consequences of democracy, embracing its multiple meanings. The first section of the report shows global levels of democracy sliding back and advances made over the past 35 years diminishing. Most of the drastic changes have taken place within the last ten years, while there are large regional variations in the levels of democracy people experience. The second section offers analyses of the geographies and population sizes of democratizing and autocratizing countries. In the third section we focus on the countries undergoing autocratization and on the indicators deteriorating the most, including media censorship, repression of civil society organizations, and academic freedom. While disinformation, polarization, and autocratization reinforce each other, democracies reduce the spread of disinformation. This is a sign of hope, of better times ahead. And this is precisely the message carried forward in the fourth section, where we switch our focus to examples of countries that managed to push back and where democracy resurfaces. Scattered over the world, these success stories share common elements that may bear implications for international democracy support and protection efforts. The final section of this year’s report offers a new perspective on shifting global balances of economic and trade power as a result of autocratization…(More)”.
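As a minimal sketch of what working with the V-Dem data looks like in practice: the file name below is hypothetical (V-Dem distributes versioned country-year CSVs), but v2x_polyarchy is the project's documented electoral democracy index.

```python
# A minimal sketch, assuming V-Dem's public country-year CSV export.
# The file name is a placeholder; "v2x_polyarchy" is V-Dem's documented
# electoral democracy index variable.
import pandas as pd

df = pd.read_csv("V-Dem-CY-Core.csv",
                 usecols=["country_name", "year", "v2x_polyarchy"])

# Restrict to the 35-year window the report discusses.
recent = df[df["year"].between(1987, 2022)]

# Global mean electoral democracy score per year: a crude view of the
# advance-then-backslide pattern the report documents.
trend = recent.groupby("year")["v2x_polyarchy"].mean()
print(trend.loc[[1987, 2000, 2012, 2022]])
```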
Access to Data for Environmental Purposes: Setting the Scene and Evaluating Recent Changes in EU Data Law
Paper by Michèle Finck and Marie-Sophie Mueller: “Few policy issues will be as defining to the EU’s future as its reaction to environmental decline, on the one hand, and digitalisation, on the other. Whereas the former will shape the (quality of) life and health of humans, animals and plants, the latter will define the future competitiveness of the internal market and, relatedly, also societal justice and cohesion. Yet, to date, the interconnections between these issues are rarely made explicit, as evidenced by the European Commission’s current policy agendas on both matters. With this article, we hope to contribute to what will ideally become a growing conversation about how to effectively bridge environmental protection and digitalisation. Specifically, we examine how EU law shapes the options of using data—the lifeblood of the digital economy—for environmental sustainability purposes, and ponder the impact of ongoing legislative reform…(More)”.
Suspicion Machines
Lighthouse Reports: “Governments all over the world are experimenting with predictive algorithms in ways that are largely invisible to the public. What limited reporting there has been on this topic has largely focused on predictive policing and risk assessments in criminal justice systems. But there is an area where even more far-reaching experiments are underway on vulnerable populations with almost no scrutiny.
Fraud detection systems are widely deployed in welfare states, ranging from complex machine learning models to crude spreadsheets. The scores they generate have potentially life-changing consequences for millions of people. Until now, public authorities have typically resisted calls for transparency, either claiming that disclosure would increase the risk of fraud or citing the need to protect proprietary technology.
The sales pitch for these systems promises that they will recover millions of euros defrauded from the public purse. The caricature of the benefit cheat is a modern take on the classic trope of the undeserving poor, and much of the public debate in Europe — which has the world’s most generous welfare states — is intensely politically charged.
The true extent of welfare fraud is routinely exaggerated by consulting firms, which are often the algorithm vendors, talking it up to nearly 5 percent of benefits spending, while some national auditors’ offices estimate it at between 0.2 and 0.4 percent of spending. Distinguishing between honest mistakes and deliberate fraud in complex public systems is messy and hard.
When opaque technologies are deployed in search of political scapegoats, the potential for harm among some of the poorest and most marginalised communities is significant.
Hundreds of thousands of people are being scored by these systems based on data mining operations where there has been scant public consultation. The consequences of being flagged by the “suspicion machine” can be drastic, with fraud controllers empowered to turn the lives of suspects inside out…(More)”.
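To make concrete what such a scoring system does, here is an illustrative sketch; it is not a reconstruction of any deployed model, and every feature, label, and threshold in it is hypothetical. The mechanics are mundane: a classifier trained on past investigation outcomes produces a score per claimant, and a cutoff turns scores into investigations.

```python
# Illustrative sketch of a welfare-fraud risk scorer; NOT any real deployed
# system. Features, labels, and the cutoff are all hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic claimant features (stand-ins for e.g. age, months on benefits,
# number of address changes).
X = rng.normal(size=(1000, 3))

# Outcomes of past investigations. Actual fraud is rare (auditors estimate
# 0.2-0.4 percent of spending); inflated to 2 percent of cases here only so
# the toy model has positives to learn from.
y = np.zeros(1000, dtype=bool)
y[rng.choice(1000, size=20, replace=False)] = True

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]  # each claimant's "risk score"

# Everyone above an arbitrary cutoff is flagged for investigation; the
# cutoff is a policy choice with life-changing consequences.
flagged = np.where(scores > np.quantile(scores, 0.99))[0]
print(f"{len(flagged)} of {len(scores)} claimants flagged")
```

Note that nothing in this pipeline is exotic; the opacity and the potential harm come from the data choices, the cutoff, and the lack of recourse, not from algorithmic sophistication.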
The Expanding Use of Technology to Manage Migration
Report by Marti Flacks, Erol Yayboke, Lauren Burke, and Anastasia Strouboulis: “Seeking to manage growing flows of migrants, the United States and European Union have dramatically expanded their engagement with migration origin and transit countries. This increasingly includes supporting the deployment of sophisticated technology to understand, monitor, and influence the movement of people across borders, expanding the spheres of interest to include the movement of people long before they reach U.S. and European borders.
This report from the CSIS Human Rights Initiative and CSIS Project on Fragility and Mobility examines two case studies of migration—one from Central America toward the United States and one from West and North Africa toward Europe—to map the use and export of migration management technologies and the associated human rights risks. Authors Marti Flacks, Erol Yayboke, Lauren Burke, and Anastasia Strouboulis provide recommendations for origin, transit, and destination governments on how to incorporate human rights considerations into their decision-making on the use of technology to manage migration…(More)”.
Foresight is a messy methodology but a marvellous mindset
Blog by Berta Mizsei: “…From my first few forays into foresight, the field seemed to rely on desk research and expert workshops while refraining from using data or testing the solidity of its assumptions. This can make scenarios weak and anecdotal, something experts justify by stating that scenarios are meant to be a ‘first step to start a discussion’.
The deficiencies of foresight became more evident when I took part in the process – so much of what ends up in imagined narratives depends on whether an expert was chatty during a workshop, or on the background of the expert writing the scenario.
To a young researcher coming from a quantitative background, this felt alien and alarming.
However, as it turns out, my issue was not with foresight per se, but rather with a certain way of doing it, one that is insufficiently grounded in sound research methods. In short, I am disturbed by ‘bad’ foresight. Foresight’s newfound popularity means that demand for foresight experts outstrips supply, and thus the prevalence of questionable foresight methodology has increased – something that was discussed during a dedicated session at this year’s Ideas Lab (CEPS’ flagship annual event).
One culprit is the Commission. Its foresight relies heavily on ‘backcasting’, a planning method that starts with a desirable future and works backwards to identify ways to achieve that outcome. One example is the 2022 Strategic Foresight Report ‘Twinning the green and digital transitions in the new geopolitical context’, which mapped out ways to reach the ideal future the Commission cabinet had imagined.
Is this useful? Undoubtedly.
However, it is also single-mindedly deterministic about the future of environmental policy, which is both notoriously complex and of critical importance to the current Commission. Similar hubris (or malpractice) is evident across various EU apparatuses – policymakers have a clear vision of what they want to happen and they invest in figuring out how to make that a reality, without admitting how turbulent and unpredictable the future is. This is commendable and politically advantageous… but it is not foresight.
It misses one of foresight’s main virtues: forcing us to consider alternative futures…(More)”.
Innovation Power: Why Technology Will Define the Future of Geopolitics
Essay by Eric Schmidt: “When Russian forces marched on Kyiv in February 2022, few thought Ukraine could survive. Russia had more than twice as many soldiers as Ukraine. Its military budget was more than ten times as large. The U.S. intelligence community estimated that Kyiv would fall within one to two weeks at most.
Outgunned and outmanned, Ukraine turned to one area in which it held an advantage over the enemy: technology. Shortly after the invasion, the Ukrainian government uploaded all its critical data to the cloud, so that it could safeguard information and keep functioning even if Russian missiles turned its ministerial offices into rubble. The country’s Ministry of Digital Transformation, which Ukrainian President Volodymyr Zelensky had established just two years earlier, repurposed its e-government mobile app, Diia, for open-source intelligence collection, so that citizens could upload photos and videos of enemy military units. With their communications infrastructure in jeopardy, the Ukrainians turned to Starlink satellites and ground stations provided by SpaceX to stay connected. When Russia sent Iranian-made drones across the border, Ukraine acquired its own drones specially designed to intercept their attacks—while its military learned how to use unfamiliar weapons supplied by Western allies. In the cat-and-mouse game of innovation, Ukraine simply proved nimbler. And so what Russia had imagined would be a quick and easy invasion has turned out to be anything but.
Ukraine’s success can be credited in part to the resolve of the Ukrainian people, the weakness of the Russian military, and the strength of Western support. But it also owes to the defining new force of international politics: innovation power. Innovation power is the ability to invent, adopt, and adapt new technologies. It contributes to both hard and soft power. High-tech weapons systems increase military might, new platforms and the standards that govern them provide economic leverage, and cutting-edge research and technologies enhance global appeal. There is a long tradition of states harnessing innovation to project power abroad, but what has changed is the self-perpetuating nature of scientific advances. Developments in artificial intelligence in particular not only unlock new areas of scientific discovery; they also speed up that very process. Artificial intelligence supercharges the ability of scientists and engineers to discover ever more powerful technologies, fostering advances in artificial intelligence itself as well as in other fields—and reshaping the world in the process…(More)”.
Health data justice: building new norms for health data governance
Paper by James Shaw & Sharifah Sekalala: “The retention and use of health-related data by government, corporate, and health professional actors risk exacerbating the harms of colonial systems of inequality in which health care and public health are situated, regardless of the intentions about how those data are used. In this context, a data justice perspective presents opportunities to develop new norms of health-related data governance that hold health justice as the primary objective. In this perspective, we define the concept of health data justice, outline urgent issues informed by this approach, and propose five calls to action from a health data justice perspective…(More)”.
Mapping and Comparing Data Governance Frameworks: A benchmarking exercise to inform global data governance deliberations
Paper by Sara Marcucci, Natalia Gonzalez Alarcon, Stefaan G. Verhulst, and Elena Wullhorst: “Data has become a critical resource for organizations and society. Yet, it is not always as valuable as it could be, since there is no well-defined approach to managing and using it. This article explores the increasing importance of global data governance due to the rapid growth of data and the need for responsible data use and protection. While historically associated with private organizational governance, data governance has evolved to include governmental and institutional bodies. However, the lack of a global consensus and fragmentation in policies and practices pose challenges to the development of a common framework. The purpose of this report is to compare approaches and identify patterns in the emergent and fragmented data governance ecosystem within sectors close to the international development field, ultimately presenting key takeaways and reflections on when and why a global data governance framework may be needed. Overall, the report highlights the need for a more holistic, coordinated transnational approach to data governance to manage the global flow of data responsibly and in the public interest. The article begins with an overview of the current fragmented data governance ecology and then describes the methodology used. Subsequently, it presents the most relevant findings stemming from the research. These are organized according to six key elements: (a) purpose, (b) principles, (c) anchoring documents, (d) data description and lifecycle, (e) processes, and (f) practices. Finally, the article closes with a series of key takeaways and final reflections…(More)”.
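As a hedged sketch of how the paper's six comparison elements could be operationalized in a benchmarking exercise (the framework names and entries below are invented placeholders, not the paper's findings):

```python
# A minimal sketch: encoding the six comparison elements named in the paper
# (purpose, principles, anchoring documents, data description and lifecycle,
# processes, practices) so frameworks can be compared side by side.
# Framework entries below are hypothetical placeholders.
from dataclasses import dataclass, field

ELEMENTS = ["purpose", "principles", "anchoring_documents",
            "data_description_and_lifecycle", "processes", "practices"]

@dataclass
class FrameworkProfile:
    name: str
    elements: dict = field(default_factory=dict)  # element -> short summary

    def coverage(self) -> float:
        """Share of the six elements the framework addresses at all."""
        return sum(1 for e in ELEMENTS if self.elements.get(e)) / len(ELEMENTS)

profiles = [
    FrameworkProfile("Hypothetical Framework A",
                     {"purpose": "responsible data sharing",
                      "principles": "FAIR-aligned"}),
    FrameworkProfile("Hypothetical Framework B",
                     {e: "addressed" for e in ELEMENTS}),
]

for p in profiles:
    print(f"{p.name}: covers {p.coverage():.0%} of the six elements")
```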
Big data for whom? Data-driven estimates to prioritize the recovery needs of vulnerable populations after a disaster
Blog and paper by Sabine Loos and David Lallemant: “For years, international agencies have been extolling the benefits of big data for sustainable development. Emerging technologies–such as crowdsourcing, satellite imagery, and machine learning–have the power to better inform decision-making, especially decisions that support the 17 Sustainable Development Goals. When a disaster occurs, overwhelming amounts of big data from emerging technology are produced with the intention of supporting disaster responders. We are seeing this now with the recent earthquakes in Turkey and Syria: space agencies are processing satellite imagery to map faults and building damage, while digital humanitarians are crowdsourcing baseline data like roads and buildings.
Eight years ago, the Nepal 2015 earthquake was no exception–emergency managers received maps of shaking and crowdsourced maps of affected people’s needs from diverse sources. A year later, I began research with a team involved in the response to the earthquake, determined to understand how big data produced after disasters were connected to the long-term effects of the earthquake. Our research team found that much of the data used to guide the recovery focused on building damage, which was often viewed as a proxy for population needs. While building damage information is useful, it does not capture the full array of social, environmental, and physical factors that will lead to disparities in long-term recovery. I had assumed that information aimed at supporting vulnerable populations would have been available immediately after the earthquake. However, as I spent time in Nepal during the years after the 2015 earthquake, speaking with government officials and nongovernmental organizations involved in the response and recovery, I found they lacked key information about the needs of the most vulnerable households–those who would face the greatest obstacles during the recovery from the earthquake. While governmental and nongovernmental actors prioritized the needs of vulnerable households as best they could with the information available, I was inspired to pursue research that could provide better information more quickly after an earthquake, to inform recovery efforts.
In our paper published in Communications Earth and Environment [link], we develop a data-driven approach to rapidly estimate which areas are likely to fall behind during recovery due to physical, environmental, and social obstacles. This approach leverages survey data on recovery progress combined with geospatial datasets that would be readily available after an event and that represent factors expected to impede recovery. To identify communities with disproportionate needs long after a disaster, we propose focusing on those who fall behind in recovery over time, or non-recovery. We focus on non-recovery since it places attention on those who do not recover, rather than delineating the characteristics of successful recovery. In addition, several groups in Nepal involved in the recovery told us that they understood vulnerability–a concept that is place-based and can change over time–as describing those who would not be able to recover from the earthquake…(More)”
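As an illustrative sketch of the general approach (not the authors' actual model or data), one might train a classifier on surveyed areas whose recovery status is known, using geospatial covariates available soon after an event, and then rank unsurveyed areas by estimated probability of non-recovery; all covariate names below are hypothetical.

```python
# Illustrative sketch of predicting non-recovery from rapidly available
# geospatial covariates; synthetic data, hypothetical covariates, and not
# the model from the Communications Earth & Environment paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Per-area covariates, e.g. building damage rate, landslide exposure,
# poverty rate, remoteness (all stand-ins).
X_surveyed = rng.random((300, 4))
X_unsurveyed = rng.random((150, 4))

# Follow-up survey outcome: True if the area had NOT recovered. Synthetic
# here, loosely tied to damage and poverty for illustration.
not_recovered = (0.6 * X_surveyed[:, 0] + 0.4 * X_surveyed[:, 2]
                 + 0.2 * rng.normal(size=300)) > 0.5

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_surveyed, not_recovered)

# Rank unsurveyed areas by estimated probability of falling behind, to help
# target recovery assistance where it is most needed.
p_non_recovery = model.predict_proba(X_unsurveyed)[:, 1]
priority = np.argsort(p_non_recovery)[::-1][:10]
print("Highest-priority areas (indices):", priority)
```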