The New Moral Mathematics


Book Review by Kieran Setiya: “Space is big,” wrote Douglas Adams in The Hitchhiker’s Guide to the Galaxy (1979). “You just won’t believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it’s a long way down the road to the chemist’s, but that’s just peanuts to space.”

What we do now affects future people in dramatic ways—above all, whether they will exist at all.

Time is big, too—even if we just think on the timescale of a species. We’ve been around for approximately 300,000 years. There are now about 8 billion of us, roughly 15 percent of all humans who have ever lived. You may think that’s a lot, but it’s just peanuts to the future. If we survive for another million years—the longevity of a typical mammalian species—at even a tenth of our current population, there will be 8 trillion more of us. We’ll be outnumbered by future people on the scale of a thousand to one.
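The review's headline figure can be reconstructed with back-of-the-envelope arithmetic. A minimal sketch, assuming (our simplification, not the review's claim) that each future person lives roughly 100 years, so the population turns over 10,000 times in a million years:

```python
# Back-of-the-envelope reconstruction of the "8 trillion more of us" figure.
# Assumption (ours, for illustration): ~100 years per person, so the
# population turns over 10,000 times in a million years.
future_population = 8_000_000_000 // 10   # a tenth of today's ~8 billion
years = 1_000_000                         # a typical mammalian species' span
lifespan = 100                            # assumed years per person

future_people = future_population * (years // lifespan)
print(future_people)                      # 8000000000000, i.e. 8 trillion

# Future people outnumber the living by roughly a thousand to one:
print(future_people // 8_000_000_000)     # 1000
```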

What we do now affects those future people in dramatic ways: whether they will exist at all and in what numbers; what values they embrace; what sort of planet they inherit; what sorts of lives they lead. It’s as if we’re trapped on a tiny island while our actions determine the habitability of a vast continent and the life prospects of the many who may, or may not, inhabit it. What an awful responsibility.

This is the perspective of the “longtermist,” for whom the history of human life so far stands to the future of humanity as a trip to the chemist’s stands to a mission to Mars.

Oxford philosophers William MacAskill and Toby Ord, both affiliated with the university’s Future of Humanity Institute, coined the word “longtermism” five years ago. Their outlook draws on utilitarian thinking about morality. According to utilitarianism—a moral theory developed by Jeremy Bentham and John Stuart Mill in the late eighteenth and nineteenth centuries—we are morally required to maximize expected aggregate well-being, adding points for every moment of happiness, subtracting points for suffering, and discounting for probability. When you do this, you find that tiny chances of extinction swamp the moral mathematics. If you could save a million lives today or shave 0.0001 percent off the probability of premature human extinction—a one in a million chance of saving at least 8 trillion lives—you should do the latter, allowing a million people to die.
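The utilitarian trade-off in the passage comes down to a one-line expected-value calculation, sketched here with the review's own illustrative numbers:

```python
# Expected-value comparison from the review (illustrative figures).
lives_saved_now = 1_000_000              # save a million lives today
future_people = 8_000_000_000_000        # at least 8 trillion potential lives
risk_reduction = 1 / 1_000_000           # 0.0001 percent, one in a million

# Expected lives saved by shaving the extinction probability:
expected_future_lives = future_people * risk_reduction
print(expected_future_lives)             # ~8 million expected lives

# Pure expected-value reasoning then favours the extinction-risk option:
print(expected_future_lives > lives_saved_now)   # True
```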

Now, as many have noted since its origin, utilitarianism is a radically counterintuitive moral view. It tells us that we cannot give more weight to our own interests or the interests of those we love than the interests of perfect strangers. We must sacrifice everything for the greater good. Worse, it tells us that we should do so by any effective means: if we can shave 0.0001 percent off the probability of human extinction by killing a million people, we should—so long as there are no other adverse effects.

But even if you think we are allowed to prioritize ourselves and those we love, and not allowed to violate the rights of some in order to help others, shouldn’t you still care about the fate of strangers, even those who do not yet exist? The moral mathematics of aggregate well-being may not be the whole of ethics, but isn’t it a vital part? It belongs to the domain of morality we call “altruism” or “charity.” When we ask what we should do to benefit others, we can’t ignore the disquieting fact that the others who occupy the future may vastly outnumber those who occupy the present, and that their very existence depends on us.

From this point of view, it’s an urgent question how what we do today will affect the further future—urgent especially when it comes to what Nick Bostrom, the philosopher who directs the Future of Humanity Institute, calls the “existential risk” of human extinction. This is the question MacAskill takes up in his new book, What We Owe the Future, a densely researched but surprisingly light read that ranges from omnicidal pandemics to our new AI overlords without ever becoming bleak…(More)”.

What is the value of data? A review of empirical methods


Policy brief by Diane Coyle and Annabel Manley: “The economy has been transformed by data in recent years. Data-driven firms made up seven of the global top 10 firms by stock market capitalisation in 2021; and across the OECD (Organisation for Economic Co-operation and Development) economies there has been a growing gap in terms of productivity and profitability between firms that use data intensively and the rest (e.g. Brynjolfsson et al 2019; Bajgar et al 2022; Coyle et al 2022). The widespread availability of data and analytics has also begun to extend into the public sector and policymaking, for example with ‘following the science’ – implying intense use of data – becoming a tagline for the handling of the COVID-19 pandemic in the UK and elsewhere.

It is therefore obvious that data has value in an economically meaningful sense. The sources of its value and characteristics of data as an economic asset are discussed at length in our earlier Value of Data report (Coyle et al 2020a). We concluded that there is potential value to the economy as a whole from having the ability to use data, and not just to the organisations that control specific data sets. This appreciation is increasingly reflected in many policy statements of data strategy and the broader debate about the governance of data (e.g. European Parliament 2022). The value of data is also explicitly and implicitly acknowledged by firms that sell data services, and investors who take dataset assets into account in stock market valuations or mergers and acquisitions.

However, despite the broad recognition of its value, and the need to develop appropriate policy frameworks, there is still no consensus method for empirically determining the value of data. Without such a method, the full potential of data will not be realised (Verhulst 2018). There are not even many examples of markets for data that would indicate a private valuation (though not the wider social value). Yet estimates of the value of data are needed to determine an appropriate level of investment, to better understand how data can contribute value to the economy, and to inform how the collection and use of different types of data should be governed.

This brief presents an overview of a range of alternative methods for data valuation, including those proposed in the existing literature. This includes some relatively widely used methods and others that are more specialist or preliminary…(More)”.

Localising AI for crisis response


Report by Aleks Berditchevskaia, Kathy Peach, and Isabel Stewart: “Putting power back in the hands of frontline humanitarians and local communities.

This report documents the results of a year-long project to design and evaluate new proof-of-concept Collective Crisis Intelligence tools. These are tools that combine data from crisis-affected communities with the processing power of AI to improve humanitarian action.

The two collective crisis intelligence tool prototypes developed were:

  • NFRI-Predict: a tool that predicts which non-food relief items (NFRI) are most needed by different types of households in different regions of Nepal after a crisis.
  • Report and Respond: a French language SMS-based tool that allows Red Cross volunteers in Cameroon to check the accuracy of COVID-19 rumours or misinformation they hear from the community while they’re in the field, and receive real-time guidance on appropriate responses.

Both tools were developed using Nesta’s Participatory AI methods, which aimed to address some of the risks associated with humanitarian AI by involving local communities in the design, development and evaluation of the new tools.

The project was a partnership between Nesta’s Centre for Collective Intelligence Design (CCID) and Data Analytics Practice (DAP), the Nepal Red Cross and Cameroon Red Cross, IFRC Solferino Academy, and Open Lab at Newcastle University, and it was funded by the UK Humanitarian Innovation Hub.

We found that collective crisis intelligence:

  • has the potential to make local humanitarian action more timely and appropriate to local needs.
  • can transform locally-generated data to drive new forms of (anticipatory) action.

We found that participatory AI:

  • can overcome several critiques and limitations of AI, as well as help improve model performance.
  • helps to surface tensions between the assumptions and standards set by AI gatekeepers versus the pragmatic reality of implementation.
  • creates opportunities for building and sharing new capabilities among frontline staff and data scientists.

We also validated that collective crisis intelligence and participatory AI can help increase trust in AI tools, but more research is needed to untangle the factors that were responsible…(More)”.

Protecting Children in Cyberconflicts


Paper by Eleonore Pauwels: “Just as digital technologies have transformed myriad aspects of daily life, they are now transforming war, politics and the social fabric.

This rapid analysis examines the ways in which cyberconflict adversely affects children and offers actions that could strengthen safeguards to protect them.

Cyberconflict can impact children directly or indirectly. Harms range from direct targeting for influence and recruitment into armed forces and armed groups, to personal data manipulation and theft, to cyber attacks on infrastructure across sectors critical to child well-being such as education and health facilities.

Many experts believe that the combination of existing international humanitarian law, international criminal law, human rights law, and child rights law is adequate to address the emerging issues posed by cyberconflict. Nevertheless, several key challenges persist. Attributing cyber attacks to specific actors and ensuring accountability have proven challenging, particularly in the so-called grey zone between war and peace.

There is an urgent need to clarify how child rights apply in the digital space and for Member States to place these rights at the centre of regulatory frameworks and legislation on new technologies…(More)”.

Using Wikipedia for conflict forecasting


Article by Christian Oswald and Daniel Ohrenhofer: “How can we improve our ability to predict conflicts? Scholars have struggled with this question for a long time. However, as a discipline, and especially over the last two decades, political science has made substantial progress. In general, what we need to improve predictions are advances in data and methodology. Data advances involve both improving the quality of existing data and developing new data sources. We propose a new data source for conflict forecasting efforts: Wikipedia.

The number of country page views indicates international salience of, or interest in, a country. Meanwhile, the number of changes to a country page indicates political controversy between opposing political views.

We took part in the Violence Early-Warning System’s friendly competition to predict changes in battle-related deaths. In our work, we evaluate our findings with out-of-sample predictions using held-out, previously unseen data, and true forecasts into the future. We find support for the predictive power of country page views, whereas we do not for page changes…

Globally available data, updated monthly, are ideal for (near) real-time forecasting. However, many commonly used data sources are available only annually. They are updated once a year, often with considerable delay.

Some of these variables, such as democracy or GDP, tend to be relatively static over time. Furthermore, many data sources face the problem of missing values. These occur when it is not possible to find reliable data for a variable for a given country.


More recent data sources such as Twitter, images or text as data, or mobile phone data, often do not provide global coverage. What’s more, collecting and manipulating data from such sources is typically computationally and/or financially costly. Wikipedia provides an alternative data source that, to some extent, overcomes many of these limitations…(More)”.
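The country-level page-view counts the authors rely on are publicly available through the Wikimedia pageviews REST API. A minimal sketch of building such a request (the endpoint layout follows the public API's documented path format; the article choice and date range are purely illustrative, and the forecasting pipeline itself is not the authors' code):

```python
from urllib.parse import quote

PAGEVIEWS_BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def monthly_pageviews_url(article: str, start: str, end: str,
                          project: str = "en.wikipedia") -> str:
    """Build a Wikimedia REST API request for monthly page views of an article.

    `start`/`end` use the API's YYYYMMDDHH format. Fetching the URL (e.g. with
    urllib or requests) returns JSON with one item per month, each carrying a
    `views` count - the salience signal described in the article.
    """
    return (f"{PAGEVIEWS_BASE}/{project}/all-access/user/"
            f"{quote(article, safe='')}/monthly/{start}/{end}")

# A country page, as in the forecasting setup sketched above:
url = monthly_pageviews_url("Ukraine", "2022010100", "2022063000")
print(url)
```

Because the data are global and refreshed continuously, a forecasting pipeline can re-query these URLs each month rather than waiting for an annual data release.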

Does public opinion shape public policy? Effect of citizen dissent on legislative outcomes


Paper by Nara Park and Jihyun Ham: “In South Korea, the Advance Notice Legislation (ANL) system requires by law that a public announcement be issued on any proposed bill that is likely to affect the fundamental rights, duties, and/or daily life of the general public. By investigating the effects of public dissent submitted via the online ANL system in South Korea, this study attempts to address the critical issue of how to increase citizen participation in the political process and to offer a possible strategy that modern democratic governments can employ in this regard. The findings suggest that citizens will actively participate in the political process to make their voices heard when an appropriate participatory mechanism is available, but they will be more active if the administration encourages citizen participation with various policies and institutions. In other words, formal and informal institutions actively interact to affect the behavior of actors both within and outside the political arena…(More)”.

Designing Data Spaces: The Ecosystem Approach to Competitive Advantage


Open access book edited by Boris Otto, Michael ten Hompel, and Stefan Wrobel: “…provides a comprehensive view on data ecosystems and platform economics from methodical and technological foundations up to reports from practical implementations and applications in various industries.

To this end, the book is structured in four parts: Part I, “Foundations and Contexts”, provides a general overview of building, running, and governing data spaces and an introduction to the IDS and GAIA-X projects. Part II, “Data Space Technologies”, subsequently details various implementation aspects of IDS and GAIA-X, including, e.g., data usage control, the usage of blockchain technologies, or semantic data integration and interoperability. Next, Part III describes various “Use Cases and Data Ecosystems” from application areas such as agriculture, healthcare, industry, energy, and mobility. Part IV finally offers an overview of several “Solutions and Applications”, e.g. including products and experiences from companies like Google, SAP, Huawei, T-Systems, Innopay and many more.

Overall, the book provides professionals in industry with an encompassing overview of the technological and economic aspects of data spaces, based on the International Data Spaces and Gaia-X initiatives. It presents implementations and business cases and gives an outlook to future developments. In doing so, it aims at proliferating the vision of a social data market economy based on data spaces which embrace trust and data sovereignty…(More)”.

Meet the new GDP prototype that tracks inequality


Article by Greg Rosalsky: “…Nearly a century after Kuznets pioneered the use of GDP, economists Thomas Blanchet, Emmanuel Saez, and Gabriel Zucman are trying to revolutionize it. In a new paper titled “Real-Time Inequality,” the economists imagine a new kind of GDP, one that isn’t merely a single number telling us about total economic growth, but a collection of numbers telling us where the gains from this growth are flowing. They already have a working prototype that they’ve published online, and it can provide some important insights about our economy right now…

Gabriel Zucman is an economist at UC Berkeley and the director of the James M. and Cathleen D. Stone Center on Wealth and Income Inequality. He has been working to transform government economic statistics — also called “national accounts” — for almost a decade. He says the national accounts offer the public valuable insights about economic growth. However, Zucman says, “The big problem is these data do not tell you who is benefiting from economic growth.”

America, of course, already has tons of data on inequality. The problem, Zucman says, is it usually takes a year or two for this data to be updated. “It’s not enough to come in two years after the policy battle, and say, ‘Look, this is what happened to inequality,'” Zucman says. “That’s too late.”

Their new project is an effort to fix this. Cobbling together data from a variety of official sources, Zucman and his colleagues have pioneered a method to compute in a more timely fashion how different income groups — like the working class and the middle class — are doing economically. They hope this prototype will inspire the federal government to follow suit and soon “produce numbers about how income is growing for each social group at the exact time when the Bureau of Economic Analysis releases its official GDP growth numbers.”

Zucman envisions a future where this data could inform and shape policy decisions. When considering policies like sending stimulus checks or providing tax relief, Zucman says, policymakers and voters need to know things like “which groups need more support, or whether the government may be actually overshooting, which might lead to inflation.”…(More)”.
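The core idea of splitting one aggregate growth number into per-group numbers can be illustrated with synthetic figures. A toy sketch only: the groups, incomes, and shares below are invented for illustration, whereas the actual project scales national accounts aggregates to tax microdata across the full distribution:

```python
# Toy illustration: one aggregate growth number vs. growth by income group.
# All numbers below are made up for illustration.
incomes_t0 = {"bottom 50%": 20_000, "middle 40%": 75_000, "top 10%": 300_000}
incomes_t1 = {"bottom 50%": 20_400, "middle 40%": 77_250, "top 10%": 321_000}
shares     = {"bottom 50%": 0.50,   "middle 40%": 0.40,   "top 10%": 0.10}

# Aggregate (GDP-style) growth: a single number for the whole economy.
total_t0 = sum(incomes_t0[g] * shares[g] for g in shares)
total_t1 = sum(incomes_t1[g] * shares[g] for g in shares)
print(f"aggregate growth: {100 * (total_t1 / total_t0 - 1):.1f}%")

# Distributional view: who actually received that growth.
for group in incomes_t0:
    growth = 100 * (incomes_t1[group] / incomes_t0[group] - 1)
    print(f"{group}: {growth:.1f}%")
```

In this toy economy the single aggregate figure (about 4.6 percent) hides the fact that the bottom half grew far more slowly than the top, which is exactly the information a distributional GDP release would surface.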

Unsustainable Alarmism


Essay by Taylor Dotson: “Covid is far from the only global challenge we see depicted as a cataclysm in the making. In 1968, Paul Ehrlich predicted impending famine and social collapse driven by overpopulation. He compared the threat to a ticking bomb — the “population bomb.” And the claim that only a few years remain to prevent climate doom has become a familiar refrain. The recent film Don’t Look Up, about a comet barreling toward Earth, is obviously meant as an allegory for climate catastrophe.

But catastrophism fails to capture the complexities of problems that play out over a long time scale, like Covid and climate change. In a tornado or a flood, which are not only undeniably serious but also require immediate action to prevent destruction, people drop political disputes to do what is necessary to save lives. They bring their loved ones to higher ground. They stack sandbags. They gather in tornado shelters. They evacuate. Covid began as a flood in early 2020, but once a danger becomes long and grinding, catastrophism loses its purchase, and more measured public thinking is required.

Even if the extension of catastrophic rhetoric to longer-term and more complex problems is well-intentioned, it unavoidably implies that something is morally or mentally wrong with the people who fail to take heed. It makes those who are not already horrified, who do not treat the crisis as an undeniable, act-now-or-never calamity, harder to comprehend: What idiot wouldn’t do everything possible to avert catastrophe? This kind of thinking is why global challenges are no longer multifaceted dilemmas to negotiate together; they have become conflicts between those who recognize the self-evident truth and those who have taken flight from reality….(More)”.

How crowdfunding is shaping the war in Ukraine


The Economist: “This month Aerorozvidka, a Ukrainian drone unit, celebrated the acquisition of four Chinese-made DJI Phantom 3 drones, provided by a German donor. The group, founded in 2014 after the Russian invasion of eastern Ukraine and annexation of Crimea, is led by civilians. The gift is just one example of crowdfunding in Russia’s latest war against Ukraine. Citizens from both sides are supplying much-needed equipment to the front lines. What is the impact of these donations, and how do the two countries differ in their approach?

Private citizens have chipped in to help in times of war for centuries. A writing tablet found near Hadrian’s Wall in northern England mentions a gift of sandals, socks and underwear for Roman soldiers. During the first world war America’s government asked civilians to knit warm clothing for troops. But besides such small morale-boosting efforts, some schemes to rally civilians have proved strikingly productive. During the second world war Britain introduced a “Spitfire Fund”, encouraging civilian groups to raise the £12,600 (£490,000, or $590,000, in today’s money) needed to build the top-of-the-range fighter. Individual contributors could buy wings, machine guns or even a rivet, for six old pence (two and a half modern ones) apiece. The scheme raised around £13m in total—enough for more than 1,000 aircraft (of a total of 20,000 built)…(More)”.