People Have a Right to Climate Data


Article by Justin S. Mankin: “As a climate scientist documenting the multi-trillion-dollar price tag of the climate disasters shocking economies and destroying lives, I sometimes field requests from strategic consultants, financial investment analysts and reinsurers looking for climate data, analysis and computer code.

Often, they want to chat about my findings or have me draw out the implications for their businesses, like the time a risk analyst from BlackRock, the world’s largest asset manager, asked me to help with research on what the current El Niño, a cyclical climate pattern, means for financial markets.

These requests make sense: People and companies want to adapt to the climate risks they face from global warming. But these inquiries are also part of the wider commodification of climate science. Venture capitalists are injecting hundreds of millions of dollars into climate intelligence as they build out a rapidly growing business of climate analytics — the data, risk models, tailored analyses and insights people and institutions need to understand and respond to climate risks.

I point companies to our freely available data and code at the Dartmouth Climate Modeling and Impacts Group, which I run, but turn down additional requests for customized assessments. I regard climate information as a public good and fear contributing to a world in which information about the unfolding risks of droughts, floods, wildfires, extreme heat and rising seas is hidden behind paywalls. People and companies who can afford private risk assessments will rent, buy and establish homes and businesses in safer places than the billions of others who can’t, compounding disadvantage and leaving the most vulnerable among us exposed.

Despite this, global consultants, climate and agricultural technology start-ups, insurance companies and major financial firms are all racing to meet the ballooning demand for information about climate dangers and how to prepare for them. While a lot of this information is public, it is often voluminous, technical and not particularly useful for people trying to evaluate their personal exposure. Private risk assessments fill that gap — but at a premium. The climate risk analytics market is expected to grow to more than $4 billion globally by 2027.

I don’t mean to suggest that the private sector should not be involved in furnishing climate information. That’s not realistic. But I worry that an overreliance on the private sector to provide climate adaptation information will hollow out publicly provided climate risk science, and that means we all will pay: the well-off with money, the poor with lives…(More)”.

A tale of two cities: one real, one virtual


Joy Lo Dico in the Financial Times: “In recent years, digital city-building has become a legitimate part of urban planning. Barcelona, Cambridge and Helsinki are among a number of European cities exploring how copies of themselves could prove useful in making their built environments sharper, faster, cleaner and greener.

What exists in real life is being rendered a second time in the digital space: creating a library of the past, an eagle’s-eye view of the present and, potentially, a vision of the future.

One of the most striking projects has been happening in Ukraine, where technology company Skeiron has, since 2022, been mapping the country’s monuments, under threat from bombing.

The project #SaveUkrainianHeritage has recorded 60 buildings, from the St Sofia Cathedral in Kyiv and the Chernivtsi National University — both Unesco world heritage sites — to wooden churches across the country, something Skeiron’s co-founder Yurii Prepodobnyi mentions with pride. There are thousands of them. “Some are only 20 or 30 square metres,” he says. “But Ukrainian churches keep Ukrainian identity.”

With laser measurements, drone photography and photogrammetry — the art of stitching photographs together — Prepodobnyi and his team can produce highly detailed 3D models.

They have even managed to recreate the exterior of the Mariupol drama theatre, destroyed in the early days of the Ukraine war, after calling for photographs and drone footage.

Another project, in Pompeii, has been using similar digital techniques to capture the evolution of excavations into a 3D model. The Pompeii I. 14 Project, led by Tulane University and Indiana State University, takes the process of excavating buildings within one block of Pompeii, Insula 14, and turns it into a digital representation. Using laser measurements, iPad Pros, a consumer drone and handheld cameras, a space can be measured to within a couple of millimetres. What the stream of data relays back is a visual record of how a room changes over thousands of years, as the debris of the volcanic eruption and the layers of life that went before are revealed…(More)”.

Ground Truths Are Human Constructions


Article by Florian Jaton: “Artificial intelligence algorithms are human-made, cultural constructs, something I saw first-hand as a scholar and technician embedded with AI teams for 30 months. Among the many concrete practices and materials these algorithms need in order to come into existence are sets of numerical values that enable machine learning. These referential repositories are often called “ground truths,” and when computer scientists construct or use these datasets to design new algorithms and attest to their efficiency, the process is called “ground-truthing.”

Understanding how ground-truthing works can reveal inherent limitations of algorithms—how they enable the spread of false information, pass biased judgments, or otherwise erode society’s agency—and this could also catalyze more thoughtful regulation. As long as ground-truthing remains clouded and abstract, society will struggle to prevent algorithms from causing harm and to optimize algorithms for the greater good.

Ground-truth datasets define AI algorithms’ fundamental goal of reliably predicting and generating a specific output—say, an image with requested specifications that resembles other input, such as web-crawled images. In other words, ground-truth datasets are deliberately constructed. As such, they, along with their resultant algorithms, are limited and arbitrary and bear the sociocultural fingerprints of the teams that made them…(More)”.
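Jaton’s point about the constructed nature of ground truths can be illustrated with a toy sketch (not from the article; the examples, labels and annotation policies below are entirely hypothetical). Two teams label the same raw events under different policies, and even the simplest “model” trained on each set inherits the team’s choices:

```python
# Hypothetical illustration: the same raw examples yield different
# "ground truths" depending on human labeling choices, and the models
# trained on them diverge accordingly.

raw_examples = [
    "peaceful march ends quietly",
    "crowd chants slogans",
    "heated exchange at rally",
    "scuffle reported near stage",
    "families picnic in park",
]

# Team A's policy: any confrontational behaviour counts as "unrest".
labels_team_a = {
    "peaceful march ends quietly": "calm",
    "crowd chants slogans": "unrest",
    "heated exchange at rally": "unrest",
    "scuffle reported near stage": "unrest",
    "families picnic in park": "calm",
}

# Team B's policy: only physical altercations count as "unrest".
labels_team_b = {
    "peaceful march ends quietly": "calm",
    "crowd chants slogans": "calm",
    "heated exchange at rally": "calm",
    "scuffle reported near stage": "unrest",
    "families picnic in park": "calm",
}

def majority_label(ground_truth):
    """'Train' the simplest possible model: predict the most common label."""
    counts = {}
    for label in ground_truth.values():
        counts[label] = counts.get(label, 0) + 1
    return max(counts, key=counts.get)

# Same inputs, different human-made reference labels, different behaviour.
model_a = majority_label(labels_team_a)  # "unrest" (3 of 5 labels)
model_b = majority_label(labels_team_b)  # "calm" (4 of 5 labels)
```

The sociocultural fingerprint is not in the algorithm at all; it is baked into the labels the algorithm is told to treat as true.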

Why Philanthropists Should Become Heretics


Article by Mark Malloch-Brown: “…There is a legitimate role for philanthropy in troubled times, but one that has to reflect them. No longer is it enough for established figures to use foundations and other philanthropies to prop up an existing order. The world of Hoffman or Bundy no longer exists, let alone that of Carnegie and Rockefeller. Today, the sector will find legitimacy only in its ability to help confront the manifold crises in ways others cannot.

In his 2018 book Just Giving, the political scientist Rob Reich brought a skeptical eye to the question of whether foundations have any valid purpose in liberal democracies but concluded that they can indeed be beneficial by fulfilling roles that only they can take on, through their distinctive constitutions. Reich identified two in particular: pluralism (foundations can challenge orthodoxies by pursuing idiosyncratic goals without clear electoral or market rationales) and discovery (foundations can serve as the “risk capital” for democratic societies, experimenting and investing for the long term). Precisely because entities in the philanthropic sector do not answer to voters or shareholders, they can be both radically urgent and radically patient: moving faster than other actors in response to a crisis or opportunity but also possessing far greater staying power, thus the ability to back projects whose success is judged in decades rather than months.

This approach demands that those who were once secular priests—the leaders of the philanthropic sector—abandon their cassocks and accept the mantle of the heretic. Only by challenging the system and agitating on its fringes can they realize their full potential in today’s crisis-bound world…(More)”

Do Policy Schools Still Have a Point?


Article by Stephen M. Walt: “Am I proposing that we toss out the current curriculum, stop teaching microeconomics, democratic theory, public accounting, econometrics, foreign policy, applied ethics, history, or any of the other building blocks of today’s public policy curriculum? Not yet. But we ought to devote more time and effort to preparing students for a world that is going to be radically different from the one we’ve known in the past—and sooner than they think.

I have three modest proposals.

First, and somewhat paradoxically, the prospect of radical change highlights the importance of basic theories. Empirical patterns derived from past experience (e.g., “democracies don’t fight each other”) may be of little value if the political and social conditions under which those patterns were observed no longer exist. To make sense of radically new circumstances, we will have to rely on causal explanations (i.e., theories) to help us foresee what is likely to occur and to anticipate the results of different policy choices. Knowledge derived from simplistic hypothesis testing or simple historical analogies will be less useful than rigorous and refined theories that tell us what’s causing what and help us understand the effects of different actions. Even more sophisticated efforts to teach “applied history” will fail if past events are not properly interpreted. The past never speaks to us directly; all historical interpretation is in some sense dependent on the theories or frameworks that we bring to these events. We need to know not just what happened in some earlier moment; we need to understand why it happened as it did and whether similar causal forces are at work today. Providing a causal explanation requires theory.

At the same time, some of our existing theories will need to be revised (or even abandoned), and new ones may need to be invented. We cannot escape reliance on some sort of theory, but rigid and uncritical adherence to a particular worldview can be just as dangerous as trying to operate solely with one’s gut instincts. For this reason, public policy schools should expose students to a wider range of theoretical approaches than they currently do and teach students how to think critically about them and to identify their limitations along with their strengths…(More)”.

We could all learn a bit about democracy from Austrian millionaire Marlene Engelhorn


Article by Seána Glennon: “In the coming week, thousands of households across Austria will receive an invitation to participate in a citizens’ assembly with a unique goal: to determine how to spend the €25 million fortune of a 31-year-old heiress, Marlene Engelhorn, who believes that the system that allowed her to inherit such a vast sum of money (tax free) is deeply flawed.

Austria, like many countries across the world, suffers from a wealth gap: a small percentage of the population controls a disproportionate amount of wealth and attendant power.

Engelhorn is not alone in calling out this unfairness; in the US, where wealth inequality has been rising for decades, a small number of the super-rich are actually pushing for higher taxes to support public services.

The Austrian experiment is unusual, however, in seeking to engage ordinary citizens in directly determining how a substantial fortune should be distributed…(More)”.

The New Digital Dark Age


Article by Gina Neff: “For researchers, social media has always represented greater access to data, more democratic involvement in knowledge production, and greater transparency about social behavior. Getting a sense of what was happening—especially during political crises, major media events, or natural disasters—was as easy as looking around a platform like Twitter or Facebook. In 2024, however, that will no longer be possible.

In 2024, we will face a grim digital dark age, as social media platforms transition away from the logic of Web 2.0 and toward one dictated by AI-generated content. Companies have rushed to incorporate large language models (LLMs) into online services, complete with hallucinations (inaccurate, unjustified responses) and mistakes, which have further fractured our trust in online information.

Another aspect of this new digital dark age comes from not being able to see what others are doing. Twitter once pulsed with the publicly readable sentiment of its users. Social researchers loved Twitter data, relying on it because it provided a ready, reasonable approximation of how a significant slice of internet users behaved. However, Elon Musk has now priced researchers out of Twitter data: the company announced it was ending free access to the platform’s API, making it difficult, if not impossible, to obtain the data needed for research on topics such as public health, natural disaster response, political campaigning, and economic activity. It was a harsh reminder that the modern internet has never been free or democratic, but instead walled and controlled.

Closer cooperation with platform companies is not the answer. X, for instance, has filed a suit against independent researchers who pointed out the rise in hate speech on the platform. Recently, it has also been revealed that researchers who used Facebook and Instagram’s data to study the platforms’ role in the US 2020 elections had been granted “independence by permission” by Meta. This means that the company chooses which projects to share its data with and, while the research may be independent, Meta also controls what types of questions are asked and who asks them…(More)”.

What It Takes to Build Democratic Institutions


Article by Daron Acemoglu: “Chile’s failure to draft a new constitution that enjoys widespread support from voters is the predictable result of allowing partisans and ideologues to lead the process. Democratic institutions are built by delivering what ordinary voters expect and demand from government, as the history of Nordic social democracy shows…

There are plenty of good models around to help both developing and industrialized countries build better democratic institutions. But with its abortive attempts to draft a new constitution, Chile is offering a lesson in what to avoid.

Though it is one of the richest countries in Latin America, Chile is still suffering from the legacy of General Augusto Pinochet’s brutal dictatorship and historic inequalities. The country has made some progress in building democratic institutions since the 1988 plebiscite that began the transition from authoritarianism, and education and social programs have reduced income inequality. But major problems remain. There are deep inequalities not just in income, but also in access to government services, high-quality educational resources, and labor-market opportunities. Moreover, Chile still has the constitution that Pinochet imposed in 1980.

Yet while it seems natural to start anew, Chile has gone about it the wrong way. Following a 2020 referendum that showed overwhelming support for drafting a new constitution, it entrusted the process to a convention of elected delegates. But only 43% of voters turned out for the 2021 election to fill the convention, and many of the candidates were from far-left circles with strong ideological commitments to draft a constitution that would crack down on business and establish myriad new rights for different communities. When the resulting document was put to a vote, 62% of Chileans rejected it…(More)”

Toward a Solid Acceptance of the Decentralized Web of Personal Data: Societal and Technological Convergence


Article by Ana Pop Stefanija et al: “Citizens using common online services such as social media, health tracking, or online shopping effectively hand over control of their personal data to the service providers—often large corporations. The services using and processing personal data are also holding the data. This situation is problematic, as has been recognized for some time: competition and innovation are stifled; data is duplicated; and citizens are in a weak position to enforce legal rights such as access, rectification, or erasure. The approach to address this problem has been to ensure that citizens can access and update, with every possible service provider, the personal data that providers hold of or about them—the foundational view taken in the European General Data Protection Regulation (GDPR).

Recently, however, various societal, technological, and regulatory efforts are taking a very different approach, turning things around. The central tenet of this complementary view is that citizens should regain control of their personal data. Once in control, citizens can decide which providers, if any, they want to share data with, and exactly which parts of their data. Moreover, they can revisit these decisions anytime…(More)”.

What does it mean to trust a technology?


Article by Jack Stilgoe: “A survey published in October 2023 revealed what seemed to be a paradox. Over the past decade, self-driving vehicles have improved immeasurably, but public trust in the technology is low and falling. Only 37% of Americans said they would be comfortable riding in a self-driving vehicle, down from 39% in 2022 and 41% in 2021. Those that have used the technology express more enthusiasm, but the rest have seemingly had their confidence shaken by the failure of the technology to live up to its hype.

Purveyors and regulators of any new technology are likely to worry about public trust. In the short term, they worry that people won’t want to make use of new innovations. But they also worry that a public backlash might jeopardize not just a single company but a whole area of technological innovation. Excitement about artificial intelligence (AI) has been accompanied by a concern about the need to “build trust” in the technology. Trust—letting one’s guard down despite incomplete information—is vital, but innovators must not take it for granted. Nor can it be circumvented through clever engineering. When cryptocurrency enthusiasts call their technology “trustless” because they think it solves age-old problems of banking (an unavoidably imperfect social institution), we should at least view them with skepticism.

For those concerned about public trust and new technologies, social science has some important lessons. The first is that people trust people, not things. When we board an airplane or agree to get vaccinated, we are placing our trust not in these objects but in the institutions that govern them. We trust that professionals are well-trained; we trust that regulators have assessed the risks; we trust that, if something goes wrong, someone will be held accountable, harms will be compensated, and mistakes will be rectified. Societies can no longer rely on the face-to-face interactions that once allowed individuals to do business. So it is more important than ever that faceless institutions are designed and continuously monitored to realize the benefits of new technologies while mitigating the risks….(More)”.