The Sensitive Politics Of Information For Digital States


Essay by Federica Carugati, Cyanne E. Loyle and Jessica Steinberg: “In 2020, Vice revealed that the U.S. military had signed a contract with Babel Street, a Virginia-based company that created a product called Locate X, which collects location data from users across a variety of digital applications. Some of these apps are seemingly innocuous: one for following storms, a Muslim dating app and a level for DIY home repair. Less innocuously, these reports indicate that the U.S. government is outsourcing some of its counterterrorism and counterinsurgency information-gathering activities to a private company.

While states have always collected information about citizens and their activities, advances in digital technologies — including new kinds of data and infrastructure — have fundamentally altered their ability to access, gather and analyze information. Bargaining with and relying on non-state actors like private companies creates tradeoffs between a state’s effectiveness and legitimacy. Those tradeoffs might be unacceptable to citizens, undermining our very understanding of what states do and how we should interact with them …(More)”

Whole of government innovation


Report by Geoff Mulgan: “‘Whole of government’ approaches – which aim to mobilise and align many ministries and agencies around a common challenge – have a long history. There have been notable examples during major wars, and around attempts to digitise societies, to cut energy use and to respond to the COVID-19 pandemic.

This paper has been prepared as part of a European Commission programme that I’m chairing, which looks at ‘whole of government innovation’ and works with national governments to help them better align their actions.

My paper – linked below – looks at the lessons of history. It outlines the many tools governments can use to achieve cross-cutting goals, linking R&D to law, regulation and procurement, and collaborating with business, universities and civil society. It argues that it is unwise to rely only on committees and boards. It shows how these choices link to innovation strategy and funding, including the relevance of half a century of experiment with moon-shots and missions.

The paper describes how the organisational challenges vary depending on the nature of the task; why governments need to avoid the common technology, or ‘STI’, trap of focusing only on hardware rather than social arrangements or business models; why constellations and flotillas of coordination are usually more realistic than true ‘whole of government’ approaches; and the importance of mobilising hearts and minds as well as money and command.

Finally, it addresses the relevance of different approaches to current tasks such as the achievement of a net zero economy and society. The paper is shared as a working document – I’m keen to find new examples and approaches…(More)”.

Collaborative Advantage: Creating Global Commons for Science, Technology, and Innovation


Essay by Leonard Lynn and Hal Salzman: “…We argue that abandoning this techno-nationalist approach and instead investing in systems of global innovation commons, modeled on successful past experiences, and developing new principles and policies for collaborative STI could bring substantially greater benefits—not only for the world, but specifically for the United States. Key to this effort will be creating systems of governance that enable nations to contribute to the commons and to benefit from its innovations, while also allowing each country substantial freedom of action…

The competitive and insular tone of contemporary discourse about STI stands in contrast to our era’s most urgent challenges, which are global in scale: the COVID-19 pandemic, climate change, and governance of complex emerging technologies such as gene editing and artificial intelligence. These global challenges, we believe, require resources, scientific understanding, and know-how that can best be developed through common resource pools to enable both global scale and rapid dissemination. Moreover, aside from moral or ethical considerations about sharing such innovations, the reality of current globalization means that solutions—such as pandemic vaccines—must spread beyond national borders to fully benefit the world. Consequently, each separate national interest will be better served by collaboratively building up the global stocks of STI as public goods. Global scientific commons could be vital in addressing these challenges, but will require new frameworks for governance that are fair and attractive to many nations while also enabling them to act individually.

A valuable perspective on the governance of common pool resources (CPR) can be found in the work that Nobel laureate Elinor Ostrom did with her colleagues beginning in the 1950s. Ostrom, a political scientist, studied how communities that must share common resources—water, fisheries, or grazing land—use trust, cooperation, and collective deliberation to manage those resources over the long term. Before Ostrom’s work, many economists believed that shared resource systems were inherently unsustainable because individuals acting in their own self-interest would ultimately undermine the good of the group, often described as “the tragedy of the commons.” Instead, Ostrom demonstrated that communities can create durable “practical algorithms” for sharing pooled resources, whether that be irrigation in Nepal or lobster fishing in Maine…(More)”.

The big idea: should governments run more experiments?


Article by Stian Westlake: “…Conceived in haste in the early days of the pandemic, Recovery (which stands for Randomised Evaluation of Covid-19 Therapy) sought to find drugs to help treat people seriously ill with the novel disease. It brought together epidemiologists, statisticians and health workers to test a range of promising existing drugs at massive scale across the NHS.

The secret of Recovery’s success is that it was a series of large, fast, randomised experiments, designed to be as easy as possible for doctors and nurses to administer in the midst of a medical emergency. And it worked wonders: within three months, it had demonstrated that dexamethasone, a cheap and widely available steroid, reduced Covid deaths by a fifth to a third. In the months that followed, Recovery identified four more effective drugs, and along the way showed that various popular treatments, including hydroxychloroquine, President Trump’s tonic of choice, were useless. All in all, it is thought that Recovery saved a million lives around the world, and it’s still going.
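To make the arithmetic behind a claim like “reduced deaths by a fifth to a third” concrete, here is a minimal sketch in Python, using entirely made-up arm sizes rather than Recovery’s actual counts, of how a two-arm randomised comparison yields a relative risk reduction:

    # Hypothetical two-arm trial: illustrative numbers only, NOT Recovery data.
    def relative_risk_reduction(deaths_treated, n_treated, deaths_control, n_control):
        """Compare death rates between arms; randomisation is what makes the
        comparison causal rather than merely correlational."""
        risk_treated = deaths_treated / n_treated
        risk_control = deaths_control / n_control
        return risk_treated, risk_control, 1 - risk_treated / risk_control

    risk_t, risk_c, rrr = relative_risk_reduction(
        deaths_treated=450, n_treated=2000,   # drug arm (made-up counts)
        deaths_control=600, n_control=2000,   # usual-care arm (made-up counts)
    )
    print(f"treated: {risk_t:.1%}, control: {risk_c:.1%}, reduction: {rrr:.0%}")
    # -> treated: 22.5%, control: 30.0%, reduction: 25%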

But Recovery’s incredible success should prompt us to ask a more challenging question: why don’t we do this more often? The question of which drugs to use was far from the only unknown we had to navigate in the early days of the pandemic. Consider the decision to delay second doses of the vaccine, when to close schools, or the right regime for Covid testing. In each case, the UK took a calculated risk and hoped for the best. But as the Royal Statistical Society pointed out at the time, it would have been cheap and quick to undertake trials so we could know for sure what the right choice was, and then double down on it.

There is a growing movement to apply randomised trials not just in healthcare but in other things government does…(More)”.

Haste: The Slow Politics of Climate Urgency


Book edited by Håvard Haarstad, Jakob Grandin, Kristin Kjærås, and Eleanor Johnson: “It’s understandable that we tend to present climate change as something urgently requiring action. Every day we fail to act, the potential for catastrophe grows. But is that framing itself a problem? When we hurry, we make more mistakes. We overlook things. We get tunnel vision.

In Haste, a group of distinguished contributors makes the case for a slow politics of urgency. Rather than rushing and speeding up, they argue, the sustainable future is better served by challenging the dominant framings through which we understand time and change in society. While recognizing the need for certain types of urgency in climate politics, Haste directs attention to the different and alternative temporalities at play in climate and sustainability politics. Divided into short and accessible chapters, written by both established and emerging scholars from different disciplines, Haste tackles a major problem in contemporary climate change research and offers creative perspectives on pathways out of the climate emergency…(More)”

An iterative regulatory process for robot governance


Paper by Hadassah Drukarch, Carlos Calleja and Eduard Fosch-Villaronga: “There is an increasing gap between the policy cycle’s speed and that of technological and social change. This gap is becoming broader and more prominent in robotics, that is, movable machines that perform tasks either automatically or with a degree of autonomy. This is because current legislation was unprepared for machine learning and autonomous agents. As a result, the law often lags behind and does not adequately frame robot technologies. This state of affairs inevitably increases legal uncertainty. It is unclear what regulatory frameworks developers have to follow to comply, often resulting in technology that does not perform well in the wild, is unsafe, and can exacerbate biases and lead to discrimination. This paper explores these issues and considers the background, key findings, and lessons learned of the LIAISON project, which stands for “Liaising robot development and policymaking” and aims to devise an alignment model for robots’ legal appraisal, channeling robot policy development from a hybrid top-down/bottom-up perspective to solve this mismatch. As such, LIAISON seeks to uncover to what extent compliance tools could be used as data generators for robot policy purposes, to unravel an optimal regulatory framing for existing and emerging robot technologies…(More)”.

How the Digital Transformation Changed Geopolitics


Paper by Dan Ciuriak: “In the late 2000s, a set of connected technological developments – introduction of the iPhone, deep learning through stacked neural nets, and application of GPUs to neural nets – resulted in the generation of truly astronomical amounts of data and provided the tools to exploit it. As the world emerged from the Great Financial Crisis of 2008-2009, data was decisively transformed from a mostly valueless by-product – “data exhaust” – to the “new oil”, the essential capital asset of the data-driven economy, and the “new plutonium” when deployed in social and political applications. This economy featured steep economies of scale, powerful economies of scope, network externalities in many applications, and pervasive information asymmetry. Strategic commercial policies at the firm and national levels were incentivized by the newfound scope to capture economic rents, destabilizing the rules-based system for trade and investment. At the same time, the new disruptive general-purpose technologies built on the nexus of Big Data, machine learning and artificial intelligence reconfigured geopolitical rivalry in several ways: by shifting great power rivalry onto new and critical grounds on which none had a decisive established advantage; by creating new vulnerabilities to information warfare in societies, especially open societies; and by enhancing the tools for social manipulation and the promotion of political personality cults. Machine learning, which essentially industrialized the very process of learning, drove an acceleration in the pace of innovation, which precipitated industrial policies driven by the desire to capture first mover advantage and by the fear of falling behind.

These developments provide a unifying framework to understand the progressive unravelling of the US-led global system as the decade unfolded, despite the fact that all the major innovations that drove the transition were within the US sphere and the US enjoyed first mover advantages. This is in stark contrast to the previous major economic transition to the knowledge-based economy, in which US leadership on the key innovations extended its dominance for decades and indeed powered its rise to its unipolar moment. The world did not respond well to the changed technological and economic conditions, and hence we are at war: hot war, cold war, technological war, trade war, social war, and internecine political war. This paper focuses on the role of technological and economic conditions in shaping geopolitics, which is critical to understand if we are to respond to the current world disorder and to prepare to handle the coming transition in technological and economic conditions to yet another new economic era based on machine knowledge capital…(More)”.

Democracy Report 2023: Defiance in the Face of Autocratization


New report by Varieties of Democracy (V-Dem): “…the largest global dataset on democracy with over 31 million data points for 202 countries from 1789 to 2022. Involving almost 4,000 scholars and other country experts, V-Dem measures hundreds of different attributes of democracy. V-Dem enables new ways to study the nature, causes, and consequences of democracy embracing its multiple meanings. The first section of the report shows global levels of democracy sliding back and the advances made over the past 35 years diminishing. Most of the drastic changes have taken place within the last ten years, while there are large regional variations in the levels of democracy people experience. The second section offers analyses of the geographies and population sizes of democratizing and autocratizing countries. In the third section we focus on the countries undergoing autocratization, and on the indicators deteriorating the most, including media censorship, repression of civil society organizations, and academic freedom. While disinformation, polarization, and autocratization reinforce each other, democracies reduce the spread of disinformation. This is a sign of hope, of better times ahead. And this is precisely the message carried forward in the fourth section, where we switch our focus to examples of countries that managed to push back and where democracy resurfaces again. Scattered over the world, these success stories share common elements that may bear implications for international democracy support and protection efforts. The final section of this year’s report offers a new perspective on shifting global balances of economic and trade power as a result of autocratization…(More)”.

When Ideology Drives Social Science


Article by Michael Jindra and Arthur Sakamoto: “Last summer in these pages, Mordechai Levy-Eichel and Daniel Scheinerman uncovered a major flaw in Richard Jean So’s Redlining Culture: A Data History of Racial Inequality and Postwar Fiction, one that rendered the book’s conclusion null and void. Unfortunately, what they found was not an isolated incident. In complex areas like the study of racial inequality, a fundamentalism has taken hold that discourages sound methodology and the use of reliable evidence about the roots of social problems.

We are not talking about mere differences in interpretation of results, which are common. We are talking about mistakes so clear that they should cause research to be seriously questioned or even disregarded. A great deal of research — we will focus on examinations of Asian American class mobility — rigs its statistical methods in order to arrive at ideologically preferred conclusions.

Most sophisticated quantitative work in sociology involves multivariate research, often in a search for causes of social problems. This work might ask how a particular independent variable (e.g., education level) “causes” an outcome or dependent variable (e.g., income). Or it could study the reverse: How does parental income influence children’s education?

Human behavior is too complicated to be explained by only one variable, so social scientists typically try to “control” for various causes simultaneously. If you are trying to test for a particular cause, you want to isolate that cause and hold all other possible causes constant. One can control for a given variable using what is called multiple regression, a statistical tool that parcels out the separate net effects of several variables simultaneously.

If you want to determine whether income causes better education outcomes, for instance, you’d want to compare everyone from a two-parent family, since family status might be another causal factor. You’d also want to see the effect of family status by comparing everyone with similar incomes. And so on for other variables.
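As a concrete sketch of what this looks like in practice (the variable names and data here are simulated for illustration, not taken from any study under discussion), a multiple regression in Python’s statsmodels estimates each variable’s net effect while holding the others constant:

    # Multiple regression sketch with simulated data (illustration only).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1_000
    df = pd.DataFrame({
        "parental_income": rng.normal(50, 15, n),  # in $1,000s
        "two_parent": rng.integers(0, 2, n),       # family structure, 0/1
    })
    # Simulate an outcome that depends on both variables plus noise.
    df["education_years"] = (
        12 + 0.05 * df["parental_income"] + 1.0 * df["two_parent"]
        + rng.normal(0, 2, n)
    )

    # Each fitted coefficient is that variable's separate net effect,
    # holding the other regressors constant.
    model = smf.ols("education_years ~ parental_income + two_parent", data=df).fit()
    print(model.params)

Omitting two_parent from the formula is exactly the kind of left-out variable the next paragraph warns about: its influence would then be misattributed to parental income.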

The problem is that there are potentially so many variables that a researcher inevitably leaves some out…(More)”.

Nudging: A Tool to Influence Human Behavior in Health Policy


Book by František Ochrana and Radek Kovács: “Behavioral economics sees “nudges” as ways to encourage people to re-evaluate their priorities in such a way that they voluntarily change their behavior, leading to personal and social benefits. This book examines nudging as a tool for influencing human behavior in health policy. The authors investigate the contemporary scientific discourse on nudging and enrich it with an ontological, epistemological, and praxeological analysis of human behavior. Based on analyses of the literature and a systematic review, the book defines nudging tools within the paradigm of prospect theory. In addition to the theoretical contribution, Nudging also examines and offers suggestions on the practice of health policy regarding obesity, malnutrition, and especially type 2 diabetes mellitus…(More)”.
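For readers unfamiliar with the paradigm the book builds on, the canonical prospect-theory value function, shown here with Tversky and Kahneman’s 1992 parameter estimates rather than anything specific to this book, is:

    v(x) =
      \begin{cases}
        x^{\alpha} & \text{if } x \ge 0 \\
        -\lambda\,(-x)^{\beta} & \text{if } x < 0
      \end{cases}
    \qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25

Because \lambda > 1, losses loom larger than equivalent gains, which is the asymmetry that loss-framed health nudges typically exploit.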