Collaborative Advantage: Creating Global Commons for Science, Technology, and Innovation


Essay by Leonard Lynn and Hal Salzman: “…We argue that abandoning this techno-nationalist approach and instead investing in systems of global innovation commons, modeled on successful past experiences, and developing new principles and policies for collaborative STI could bring substantially greater benefits—not only for the world, but specifically for the United States. Key to this effort will be creating systems of governance that enable nations to contribute to the commons and to benefit from its innovations, while also allowing each country substantial freedom of action…

The competitive and insular tone of contemporary discourse about STI stands in contrast to our era’s most urgent challenges, which are global in scale: the COVID-19 pandemic, climate change, and governance of complex emerging technologies such as gene editing and artificial intelligence. These global challenges, we believe, require resources, scientific understanding, and know-how that can best be developed through common resource pools to enable both global scale and rapid dissemination. Moreover, aside from moral or ethical considerations about sharing such innovations, the reality of current globalization means that solutions—such as pandemic vaccines—must spread beyond national borders to fully benefit the world. Consequently, each separate national interest will be better served by collaboratively building up the global stocks of STI as public goods. Global scientific commons could be vital in addressing these challenges, but will require new frameworks for governance that are fair and attractive to many nations while also enabling them to act individually.

A valuable perspective on the governance of common pool resources (CPR) can be found in the work that Nobel laureate Elinor Ostrom did with her colleagues beginning in the 1950s. Ostrom, a political scientist, studied how communities that must share common resources—water, fisheries, or grazing land—use trust, cooperation, and collective deliberation to manage those resources over the long term. Before Ostrom’s work, many economists believed that shared resource systems were inherently unsustainable because individuals acting in their own self-interest would ultimately undermine the good of the group, often described as “the tragedy of the commons.” Instead, Ostrom demonstrated that communities can create durable “practical algorithms” for sharing pooled resources, whether that be irrigation in Nepal or lobster fishing in Maine…(More)”.

The big idea: should governments run more experiments?


Article by Stian Westlake: “…Conceived in haste in the early days of the pandemic, Recovery (which stands for Randomised Evaluation of Covid-19 Therapy) sought to find drugs to help treat people seriously ill with the novel disease. It brought together epidemiologists, statisticians and health workers to test a range of promising existing drugs at massive scale across the NHS.

The secret of Recovery’s success is that it was a series of large, fast, randomised experiments, designed to be as easy as possible for doctors and nurses to administer in the midst of a medical emergency. And it worked wonders: within three months, it had demonstrated that dexamethasone, a cheap and widely available steroid, reduced Covid deaths by a fifth to a third. In the months that followed, Recovery identified four more effective drugs, and along the way showed that various popular treatments, including hydroxychloroquine, President Trump’s tonic of choice, were useless. All in all, it is thought that Recovery saved a million lives around the world, and it’s still going.
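
As an aside from us (not from Westlake’s article): the arithmetic behind a finding like “reduced Covid deaths by a fifth to a third” is simply a comparison of death rates between randomised arms. A minimal sketch in Python, with made-up counts chosen only for illustration:

```python
# Minimal sketch of a two-arm randomised trial readout.
# All counts below are hypothetical, chosen only to illustrate the
# arithmetic -- they are NOT the actual RECOVERY trial data.

deaths_treated, n_treated = 450, 2000     # treatment arm (hypothetical)
deaths_control, n_control = 1100, 4000    # usual-care arm (hypothetical)

risk_treated = deaths_treated / n_treated      # 0.225
risk_control = deaths_control / n_control      # 0.275

relative_risk = risk_treated / risk_control    # ~0.82
reduction = 1 - relative_risk                  # ~0.18, i.e. roughly "a fifth"

print(f"Relative risk: {relative_risk:.2f}")
print(f"Relative risk reduction: {reduction:.0%}")
```

(In the actual trial, the size of the reduction varied by patient group — roughly a third for ventilated patients and a fifth for those on oxygen alone — hence the range Westlake quotes.)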

But Recovery’s incredible success should prompt us to ask a more challenging question: why don’t we do this more often? The question of which drugs to use was far from the only unknown we had to navigate in the early days of the pandemic. Consider the decision to delay second doses of the vaccine, the question of when to close schools, or the right regime for Covid testing. In each case, the UK took a calculated risk and hoped for the best. But as the Royal Statistical Society pointed out at the time, it would have been cheap and quick to undertake trials so we could know for sure what the right choice was, and then double down on it.

There is a growing movement to apply randomised trials not just in healthcare but in other things government does…(More)”.

Haste: The Slow Politics of Climate Urgency


Book edited by Håvard Haarstad, Jakob Grandin, Kristin Kjærås, and Eleanor Johnson: “It’s understandable that we tend to present climate change as something urgently requiring action. Every day we fail to act, the potential for catastrophe grows. But is that framing itself a problem? When we hurry, we make more mistakes. We overlook things. We get tunnel vision.

In Haste, a group of distinguished contributors makes the case for a slow politics of urgency. Rather than rushing and speeding up, they argue, the sustainable future is better served by challenging the dominant framings through which we understand time and change in society. While recognizing the need for certain types of urgency in climate politics, Haste directs attention to the different and alternative temporalities at play in climate and sustainability politics. Divided into short and accessible chapters, written by both established and emerging scholars from different disciplines, Haste tackles a major problem in contemporary climate change research and offers creative perspectives on pathways out of the climate emergency…(More)”

An iterative regulatory process for robot governance


Paper by Hadassah Drukarch, Carlos Calleja and Eduard Fosch-Villaronga: “There is an increasing gap between the policy cycle’s speed and that of technological and social change. This gap is becoming broader and more prominent in robotics, that is, in movable machines that perform tasks either automatically or with a degree of autonomy. This is because current legislation was unprepared for machine learning and autonomous agents. As a result, the law often lags behind and does not adequately frame robot technologies. This state of affairs inevitably increases legal uncertainty. It is unclear what regulatory frameworks developers have to follow to comply, often resulting in technology that does not perform well in the wild, is unsafe, and can exacerbate biases and lead to discrimination. This paper explores these issues and considers the background, key findings, and lessons learned of the LIAISON project, which stands for “Liaising robot development and policymaking” and aims to devise an alignment model for the legal appraisal of robots, channeling robot policy development through a hybrid top-down/bottom-up perspective to resolve this mismatch. As such, LIAISON seeks to uncover to what extent compliance tools could be used as data generators for robot policy purposes to unravel an optimal regulatory framing for existing and emerging robot technologies…(More)”.

How the Digital Transformation Changed Geopolitics


Paper by Dan Ciuriak: “In the late 2000s, a set of connected technological developments – introduction of the iPhone, deep learning through stacked neural nets, and application of GPUs to neural nets – resulted in the generation of truly astronomical amounts of data and provided the tools to exploit it. As the world emerged from the Great Financial Crisis of 2008-2009, data was decisively transformed from a mostly valueless by-product – “data exhaust” – to the “new oil”, the essential capital asset of the data-driven economy, and the “new plutonium” when deployed in social and political applications. This economy featured steep economies of scale, powerful economies of scope, network externalities in many applications, and pervasive information asymmetry. Strategic commercial policies at the firm and national levels were incentivized by the newfound scope to capture economic rents, destabilizing the rules-based system for trade and investment. At the same time, the new disruptive general-purpose technologies built on the nexus of Big Data, machine learning and artificial intelligence reconfigured geopolitical rivalry in several ways: by shifting great power rivalry onto new and critical grounds on which none had a decisive established advantage; by creating new vulnerabilities to information warfare in societies, especially open societies; and by enhancing the tools for social manipulation and the promotion of political personality cults. Machine learning, which essentially industrialized the very process of learning, drove an acceleration in the pace of innovation, which precipitated industrial policies driven by the desire to capture first mover advantage and by the fear of falling behind.

These developments provide a unifying framework for understanding the progressive unravelling of the US-led global system as the decade unfolded, despite the fact that all the major innovations that drove the transition were within the US sphere and the US enjoyed first mover advantages. This is in stark contrast to the previous major economic transition, to the knowledge-based economy, in which US leadership on the key innovations extended its dominance for decades and indeed powered its rise to its unipolar moment. The world did not respond well to the changed technological and economic conditions, and hence we are at war: hot war, cold war, technological war, trade war, social war, and internecine political war. This paper focuses on the role of technological and economic conditions in shaping geopolitics, which is critical to understand if we are to respond to the current world disorder and to prepare for the coming transition in technological and economic conditions to yet another new economic era based on machine knowledge capital…(More)”.

Democracy Report 2023: Defiance in the Face of Autocratization


New report by Varieties of Democracy (V-Dem): “…the largest global dataset on democracy with over 31 million data points for 202 countries from 1789 to 2022. Involving almost 4,000 scholars and other country experts, V-Dem measures hundreds of different attributes of democracy. V-Dem enables new ways to study the nature, causes, and consequences of democracy embracing its multiple meanings. The first section of the report shows global levels of democracy sliding back and advances made over the past 35 years diminishing. Most of the drastic changes have taken place within the last ten years, while there are large regional variations in relation to the levels of democracy people experience. The second section offers analyses on the geographies and population sizes of democratizing and autocratizing countries. In the third section we focus on the countries undergoing autocratization, and on the indicators deteriorating the most, including in relation to media censorship, repression of civil society organizations, and academic freedom. While disinformation, polarization, and autocratization reinforce each other, democracies reduce the spread of disinformation. This is a sign of hope, of better times ahead. And this is precisely the message carried forward in the fourth section, where we switch our focus to examples of countries that managed to push back and where democracy resurfaces again. Scattered over the world, these success stories share common elements that may bear implications for international democracy support and protection efforts. The final section of this year’s report offers a new perspective on shifting global balances of economic and trade power as a result of autocratization…(More)”.

When Ideology Drives Social Science


Article by Michael Jindra and Arthur Sakamoto: “Last summer in these pages, Mordechai Levy-Eichel and Daniel Scheinerman uncovered a major flaw in Richard Jean So’s Redlining Culture: A Data History of Racial Inequality and Postwar Fiction, one that rendered the book’s conclusion null and void. Unfortunately, what they found was not an isolated incident. In complex areas like the study of racial inequality, a fundamentalism has taken hold that discourages sound methodology and the use of reliable evidence about the roots of social problems.

We are not talking about mere differences in interpretation of results, which are common. We are talking about mistakes so clear that they should cause research to be seriously questioned or even disregarded. A great deal of research — we will focus on examinations of Asian American class mobility — rigs its statistical methods in order to arrive at ideologically preferred conclusions.

Most sophisticated quantitative work in sociology involves multivariate research, often in a search for causes of social problems. This work might ask how a particular independent variable (e.g., education level) “causes” an outcome or dependent variable (e.g., income). Or it could study the reverse: How does parental income influence children’s education?

Human behavior is too complicated to be explained by only one variable, so social scientists typically try to “control” for various causes simultaneously. If you are trying to test for a particular cause, you want to isolate that cause and hold all other possible causes constant. One can control for a given variable using what is called multiple regression, a statistical tool that parcels out the separate net effects of several variables simultaneously.

If you want to determine whether income causes better education outcomes, you’d want to compare, for instance, only people from two-parent families, since family status might be another causal factor. You’d also want to see the effect of family status by comparing people with similar incomes. And so on for other variables.
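
To make the logic of “controlling for” concrete, here is a minimal sketch in Python using simulated data (ours, not the authors’; all variable names and numbers are hypothetical):

```python
# Minimal sketch of "controlling for" a variable with multiple regression,
# using simulated data. All numbers and variable names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# Simulate a confounded world: family structure affects both income
# and education outcomes, so the raw income-education association
# overstates income's net effect.
two_parent = rng.binomial(1, 0.6, n)                   # 1 = two-parent family
income = 30 + 20 * two_parent + rng.normal(0, 10, n)   # in $1,000s
education = 10 + 0.05 * income + 1.5 * two_parent + rng.normal(0, 2, n)

# Naive model: education on income alone (omits family structure).
naive = sm.OLS(education, sm.add_constant(income)).fit()

# Controlled model: regression parcels out each variable's net effect.
X = sm.add_constant(np.column_stack([income, two_parent]))
controlled = sm.OLS(education, X).fit()

print(f"Income coefficient, no controls:   {naive.params[1]:.3f}")   # biased upward
print(f"Income coefficient, with controls: {controlled.params[1]:.3f}")  # near true 0.05
```

The naive coefficient overstates income’s effect because it absorbs the influence of family structure; the controlled model recovers something close to the true value. The same mechanics cut the other way: leave out a real cause, or control for the wrong thing, and the estimate shifts — which is exactly the degree of freedom the authors argue can be abused.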

The problem is that there are potentially so many variables that a researcher inevitably leaves some out…(More)”.

Nudging: A Tool to Influence Human Behavior in Health Policy


Book by František Ochrana and Radek Kovács: “Behavioral economics sees “nudges” as ways to encourage people to re-evaluate their priorities in such a way that they voluntarily change their behavior, leading to personal and social benefits. This book examines nudging as a tool for influencing human behavior in health policy. The authors investigate the contemporary scientific discourse on nudging and enrich it with an ontological, epistemological, and praxeological analysis of human behavior. Based on analyses of the literature and a systematic review, the book defines nudging tools within the paradigm of prospect theory. In addition to the theoretical contribution, Nudging also examines and offers suggestions on the practice of health policy regarding obesity, malnutrition, and especially type 2 diabetes mellitus…(More)”.
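
For context (our gloss, not the book’s): the standard Tversky–Kahneman value function at the core of prospect theory is

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \\
-\lambda\,(-x)^{\beta}, & x < 0
\end{cases}
\qquad \alpha \approx \beta \approx 0.88,\quad \lambda \approx 2.25
```

Because λ > 1, losses loom larger than equal-sized gains, which is why loss-framed health messages (“you will lose X if you don’t…”) often nudge harder than gain-framed ones.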

The Future of Compute


Independent Review by a UK Expert Panel: “…Compute is a material part of modern life. It is among the critical technologies lying behind innovation, economic growth and scientific discoveries. Compute improves our everyday lives. It underpins all the tools, services and information we hold on our handheld devices – from search engines and social media, to streaming services and accurate weather forecasts. This technology may be invisible to the public, but life today would be very different without it.

Sectors across the UK economy, both new and old, are increasingly reliant upon compute. By leveraging the capability that compute provides, businesses of all sizes can extract value from the enormous quantity of data created every day; reduce the cost and time required for research and development (R&D); improve product design; accelerate decision making processes; and increase overall efficiency. Compute also enables advancements in transformative technologies, such as AI, which themselves lead to the creation of value and innovation across the economy. This all translates into higher productivity and profitability for businesses and robust economic growth for the UK as a whole.

Compute powers modelling, simulations, data analysis and scenario planning, and thereby enables researchers to develop new drugs; find new energy sources; discover new materials; mitigate the effects of climate change; and model the spread of pandemics. Compute is required to tackle many of today’s global challenges and brings invaluable benefits to our society.

Compute’s effects on society and the economy have already been and, crucially, will continue to be transformative. The scale of compute capabilities keeps accelerating at pace. The performance of the world’s fastest compute has grown by a factor of 626 since 2010. The compute requirements of the largest machine learning models have grown 10 billion times over the last 10 years. We expect compute demand to grow significantly as compute capability continues to increase. Technology today operates very differently to 10 years ago and, in a decade’s time, it will have changed once again.
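
A quick back-of-envelope check on what those figures imply annually (our arithmetic, assuming a ~2010–2022 window; not a calculation from the review itself):

```python
# Implied annual growth rates from the review's headline figures.
# Illustrative arithmetic only; the 12- and 10-year windows are our
# assumptions, not figures stated by the review.

fastest_factor, fastest_years = 626, 12    # "factor of 626 since 2010"
ml_factor, ml_years = 1e10, 10             # "10 billion times over the last 10 years"

annual_fastest = fastest_factor ** (1 / fastest_years)   # ~1.71x per year
annual_ml = ml_factor ** (1 / ml_years)                  # 10x per year

print(f"Fastest systems: ~{annual_fastest:.2f}x per year (~{annual_fastest - 1:.0%} growth)")
print(f"Largest ML models: ~{annual_ml:.0f}x per year")
```

The contrast is striking: by these figures, demand from the largest machine learning workloads has been compounding far faster than the headline performance of the fastest systems.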

Yet, despite compute’s value to the economy and society, the UK lacks a long-term vision for compute…(More)”.

The Expanding Use of Technology to Manage Migration


Report by Marti Flacks, Erol Yayboke, Lauren Burke, and Anastasia Strouboulis: “Seeking to manage growing flows of migrants, the United States and European Union have dramatically expanded their engagement with migration origin and transit countries. This increasingly includes supporting the deployment of sophisticated technology to understand, monitor, and influence the movement of people across borders, expanding the spheres of interest to include the movement of people long before they reach U.S. and European borders.

This report from the CSIS Human Rights Initiative and CSIS Project on Fragility and Mobility examines two case studies of migration—one from Central America toward the United States and one from West and North Africa toward Europe—to map the use and export of migration management technologies and the associated human rights risks. Authors Marti Flacks, Erol Yayboke, Lauren Burke, and Anastasia Strouboulis provide recommendations for origin, transit, and destination governments on how to incorporate human rights considerations into their decisionmaking on the use of technology to manage migration…(More)”.