

Paper by DemNext: “Africa faces a paradox. Most people continue to support democratic institutions, even as their satisfaction with those institutions’ ability to deliver inclusive economic prosperity and accountable, responsive governance declines. Citizens’ assemblies offer a way forward: an opportunity to draw on indigenous traditions of sustained deliberation and consensus-building to tackle complex policy problems.

In this paper, we explore how citizens’ assemblies can be adapted to Africa’s diverse contexts by drawing on real-world experiences across the continent. We begin by outlining the civic strengths and cultural traditions that underpin deliberative democracy in Africa, before reviewing emerging deliberative experiments – including citizens’ assemblies – that illustrate their potential. We introduce an analytical framework to assess the strengths and limitations of citizens’ assemblies and apply it to case studies from Mali, Malawi, and The Gambia. Finally, we highlight insights from an upcoming citizens’ assembly in South Africa.

The paper serves two purposes: advancing theoretical frameworks for evaluating deliberative processes in the Global South, and offering practical guidance to foster experimentation and collaboration in democratic innovation across these contexts. Rather than proposing a single model, we identify context-sensitive strategies that help citizens’ assemblies bridge Africa’s democratic delivery gap, while building on longstanding traditions of collective decision making…(More)”.

Deliberative Democracy in Africa: Learning from Past Citizens’ Assemblies & Guidance for Future Action

Article by Stefaan Verhulst: “The world has become more complex, more dynamic and more interconnected than ever before. The challenges we face – from health to climate, from democratic resilience to economic transformation – are deeply intertwined. And we need new ideas to meet these challenges.  

Europe has never lacked intellectual ambition, but ideas alone aren’t enough. To make real progress, we need breakthrough discoveries. We need evidence of what works. And we need the institutional capacity to test, validate and scale solutions across borders and disciplines. 

That’s where science comes in. Yet good science depends on data. And if we want AI to supercharge discovery and transform science, then data becomes even more important. 

The ‘datafication’ of society

Digitalisation has led to an unprecedented datafication of society. When citizens engage with government services, visit a doctor, use a mobility platform, shop online or track their steps and sleep through wearable devices, data are generated. 

But this datafication doesn’t stop with individual behaviour. It extends deep into the productive fabric of our economies. Manufacturing systems, industrial supply chains, logistics networks, energy grids and robotic production lines are now embedded with sensors, connected devices and intelligent control systems. The implication is profound – data is no longer a by-product of digital services alone. It’s a structural feature of both our digital and physical infrastructures. 

The remarkable feature of digital data isn’t merely its volume. It’s its reusability. When done responsibly, data created for one purpose can often be reused for entirely different objectives – including scientific research. 

But there’s a fundamental constraint: access. Much of today’s most valuable data remains locked away in institutional stovepipes – within government agencies, universities and private companies. Despite its public value potential, it often remains inaccessible to scientists and public interest actors. 

Europe has taken important steps to address this data asymmetry. Open data policies have expanded transparency. The Data Governance Act and the Data Act seek to facilitate data sharing and rebalance power in data markets. Article 40 of the Digital Services Act creates pathways for vetted researchers to access platform data. The European Open Science Cloud seeks to enable the sharing of scientific data. Sectoral data spaces – including those envisioned under the European Health Data Space – and Data Labs aim to provide structured, interoperable infrastructures for data access and use. 

Yet instead of a steady expansion of access, we’re now witnessing a ‘data winter.’ Access to private sector data for research has declined in several domains. Open government data initiatives have slowed or been rolled back. Scientific datasets have become restricted or have disappeared. Open science has struggled to scale beyond pilot projects. And broader political retrenchment risks weakening some of the very infrastructures designed to enable responsible reuse. 

Generative AI’s rapid expansion has also triggered backlash. Large-scale data scraping for AI training has blurred the line between openness and extraction. Consequently, institutions and content creators have become more protective, sometimes closing access altogether. And without reliable access to diverse, high-quality data, scientific progress risks stagnation. 

What should Europe do? Three priorities stand out. 

Access shouldn’t be only supply-driven

For too long, data policy has focused on releasing datasets without clearly articulating the questions they’re meant to answer. But the value of data – and increasingly the value of AI – depends directly on the value of the question. 

In short, better questions define better discovery. 

If we want to unlock meaningful access, we must invest in what might be called ‘question science’ – the systematic identification of high-priority societal questions; the structuring of those questions so they are researchable and actionable; the mapping of those questions to existing or potential data sources; and the embedding of those questions into funding frameworks, governance mandates, and institutional strategies. 
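
As a loose illustration of the mapping step – not drawn from the article – here is a minimal Python sketch. Every name in it (PriorityQuestion, map_questions_to_sources, the toy catalogue) is hypothetical, invented purely for this example:

```python
from dataclasses import dataclass

# Hypothetical structures, invented for illustration only.
@dataclass
class PriorityQuestion:
    """A societal question structured so it is researchable and actionable."""
    statement: str       # the high-priority question itself
    metrics: list[str]   # what would need to be measured to answer it

def map_questions_to_sources(questions, catalogue):
    """Map each question to catalogued data sources that cover at least one
    of the metrics the question requires."""
    return {
        q.statement: [src for src, covered in catalogue.items()
                      if any(m in covered for m in q.metrics)]
        for q in questions
    }

# Toy example: one question, two catalogued data holders
q = PriorityQuestion(
    statement="How does air quality affect school absenteeism in EU cities?",
    metrics=["pm2.5", "school attendance"],
)
catalogue = {
    "urban air-quality sensor network": ["pm2.5", "no2"],
    "national education registry": ["school attendance", "test scores"],
}
print(map_questions_to_sources([q], catalogue))
```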

When demand is vague, access debates remain abstract. When questions are clear, access becomes purposeful. Researchers, policymakers and data holders can align around concrete objectives. This requires structured, participatory processes that bring scientists, communities, funders and regulators together to define and prioritise the questions that matter most…(More)”.

Legitimate data access will determine whether European science has a bright or bleak future

Article by Melissa A. Haendel et al: “It can take many years for evidence generated in research to influence health care guidelines. Meanwhile, the vast data collected during everyday life, particularly during engagement with the health care system, remain largely untapped for public health, precision medicine, postmarket safety, and real-time decision-making. These “real-world data” (RWD) remain fragmented, proprietary, noninteroperable, and inconsistently governed. Although some approaches to RWD have demonstrated value in limited settings, their impact has remained constrained by uneven incentives, voluntary compliance, and the absence of routine auditability of data access and use. To address this, health data should be governed through federated, standards-based, community-driven models that reflect their public benefit, empower patients and communities, and foster public trust and participation. To help achieve these goals, we propose governing health data as essential infrastructure by using public utility models, defined by their public good, distributed stewardship, and public oversight.

Recently, the US Advanced Research Projects Agency for Health (ARPA-H) requested information on economic models that could lower barriers to data access, enable research, and compensate vendors to realize a health data public utility. Existing infrastructure already includes distributed data networks, publicly funded research enclaves, and privately funded platforms, and recent public and private investments have created new entrants that are improving access to health data. We argue, however, that these efforts remain limited by fragmented incentives and governance and must be complemented by reimagining legal, regulatory, and economic policies under a public utility model. Although most challenges and examples described here are drawn from the United States, the underlying lessons and opportunities are globally applicable. In this context, a public utility model addresses persistent barriers to integration, investment, and governance by converting voluntary participation into enforceable obligations, aligning financial incentives with interoperability, and embedding accountability within continuous public oversight…(More)”.

Governing real-world health data as a public utility

Book by Benjamin Recht: “In the 1940s, mathematicians set out to design computers that could act as ideal rational agents in the face of uncertainty. The Irrational Decision tells the story of how they settled on a peculiar mathematical definition of rationality in which every decision is a statistical question of risk. Benjamin Recht traces how this quantitative standard came to define our understanding of rationality, looking at the history of optimization, game theory, statistical testing, and machine learning. He explains why, now more than ever, we need to resist efforts by powerful tech interests to drive public policy and essentially rule our lives.

While mathematical rationality has proven valuable in accelerating computers, regulating pharmaceuticals, and deploying electronic commerce, it fails to solve messy human problems and has given rise to a view of a rational world that is not only overquantified but surprisingly limited. Recht shows how these mathematical methods emerged from wartime research and influenced fields ranging from economics to health care, drawing on illuminating examples ranging from diet planning to chess to self-driving cars.

Highlighting both the power and limitations of mathematical rationality, The Irrational Decision reveals why only humans can resolve fundamentally political or value-based questions and proposes a more expansive approach to decision making that is appropriately supported by computational tools yet firmly rooted in human intuition, morality, and judgment…(More)”.

The Irrational Decision: How We Gave Computers the Power to Choose for Us

Blog by Cosima Lenz, Stefaan Verhulst, and Roshni Singh: “In February, The Governance Lab and CEPS convened researchers, policymakers, funders and advocates to advance the next phase of the 100 Questions Initiative: shifting from identifying priorities to operationalising and institutionalising them within the EU.

Below are twelve takeaways for EU stakeholders.

1. Institutionalise question-driven research

Questions determine what’s measured, funded and prioritised. Questions about women’s health must be embedded upstream within EU research frameworks. This could include requiring funded proposals to outline a clear ‘question statement’, alongside establishing a public European Women’s Health Question Catalogue to guide calls, policy design and investment.

2. Frame women’s health as a competitiveness priority

Even with over EUR 2 billion invested across 1,000+ projects under Horizon 2020 and Horizon Europe, there are still gaps. Positioning women’s health innovation as a competitiveness driver would align political, financial and private-sector incentives while strengthening Europe’s global leadership.

3. Embed women’s health in the Multiannual Financial Framework (MFF)

Women’s health should be explicitly integrated across EU instruments, including the upcoming MFF. Ring-fenced funding, targeted calls and dedicated innovation challenges, potentially via the European Innovation Council, would improve accountability and reduce fragmentation…(More)”.

From questions to institutionalisation – how to embed women’s health priorities in EU research and policy

Paper by Ivan Decostanzi, Yelena Mejova and Kyriaki Kalimeri: “Timely and accurate situational reports are essential for humanitarian decision-making, yet current workflows remain largely manual, resource-intensive, and inconsistent. We present a fully automated framework that uses large language models (LLMs) to transform heterogeneous humanitarian documents into structured and evidence-grounded reports. The system integrates semantic text clustering, automatic question generation, retrieval-augmented answer extraction with citations, multi-level summarization, and executive summary generation, supported by internal evaluation metrics that emulate expert reasoning. We evaluated the framework across 13 humanitarian events, including natural disasters and conflicts, using more than 1,100 documents from verified sources such as ReliefWeb. The generated questions achieved 84.7 percent relevance, 84.0 percent importance, and 76.4 percent urgency. The extracted answers reached 86.3 percent relevance, with citation precision and recall both exceeding 76 percent. Agreement between human and LLM-based evaluations surpassed an F1 score of 0.80. Comparative analysis shows that the proposed framework produces reports that are more structured, interpretable, and actionable than existing baselines. By combining LLM reasoning with transparent citation linking and multi-level evaluation, this study demonstrates that generative AI can autonomously produce accurate, verifiable, and operationally useful humanitarian situation reports…(More)”
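
The abstract names the pipeline stages without detailing how they fit together. The following is a minimal sketch of one way those stages might chain, assuming caller-supplied llm (text in, text out) and embed (text to vector) callables; the prompts, clustering choice and stage internals here are illustrative assumptions, not the authors’ implementation:

```python
import numpy as np
from sklearn.cluster import KMeans

def situation_report(documents, llm, embed, n_clusters=8):
    """Sketch of the stages named in the abstract. `llm` and `embed` are
    assumed caller-supplied callables; nothing here is the paper's code."""
    # 1. Semantic text clustering: group source documents by embedding similarity
    vectors = np.array([embed(d) for d in documents])
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(vectors)
    clusters = {c: [d for d, lab in zip(documents, labels) if lab == c]
                for c in set(labels)}

    sections = []
    for cluster_docs in clusters.values():
        context = "\n\n".join(cluster_docs)
        # 2. Automatic question generation per thematic cluster
        questions = llm(f"List the key humanitarian questions raised by:\n{context}")
        # 3. Retrieval-augmented answer extraction, citing a source for each claim
        answers = llm(f"Answer each question using only the sources below, "
                      f"citing a source for every claim.\n"
                      f"Questions:\n{questions}\nSources:\n{context}")
        # 4. Multi-level summarization: condense each thematic section
        sections.append(llm(f"Summarize these cited findings:\n{answers}"))

    # 5. Executive summary generated over all section summaries
    return llm("Write an executive summary of:\n" + "\n\n".join(sections))
```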

A Large-Language-Model Framework for Automated Humanitarian Situation Reporting

Book edited by Kelebogile Zvobgo and Francesca Parente: “This timely book presents a practical framework for conceptualizing and analyzing human rights issues such as repression, compliance, and transitional justice in an increasingly fraught climate for human rights globally. Emerging and established experts advance quantitative and mixed-methods research, showcasing innovative ways of measuring and evaluating multifaceted concepts.

Chapters cover a broad range of salient topics including state repression, civil society activism, compliance with international law, and transitional justice. Emphasizing that rigorous research is driven by substance, not methods, the contributing authors explain how they measure concepts that are vital to human rights research. They showcase diverse forms of evidence in descriptive and analytical studies, as well as guidance for using cutting-edge techniques like machine learning and text analysis, charting a path for future empirical human rights research…(More)”.

Innovations in Human Rights

Paper by Open Data Watch and Paris21: “Deep cuts in development financing for statistics, legitimacy issues, rapid technological change like AI, and rising expectations for more inclusive and participatory data are colliding with long-standing weaknesses in trust, capacity, and data use. For many national statistical offices (NSOs), particularly in low- and middle-income countries, this convergence amounts to a systemic data crisis that threatens their relevance, credibility, and sustainability. At the same time, these pressures create a rare opportunity to rethink how data systems are designed, governed, and embedded in society.

This paper argues that the statistical community has reached a fork in the road. Incremental adjustment alone may no longer be sufficient. Countries and the international community face a strategic choice between two broad paths, each with distinct implications for legitimacy, financing, risk, and equity. Rather than prescribing a single solution, the paper aims to provoke informed debate ahead of the 57th Session of the UN Statistical Commission…(More)”.

Data Systems at a Crossroads: Official Statistics for a New Era

Centre for Humanitarian Data: “Based on analysis of the HDX Data Grids, we estimate that 68 percent of crisis data is available and up to date across 22 humanitarian operations, down from 74 percent in the previous year. 

The report provides details on the data available for each location, category and sub-category covered in the Data Grids. The 22 Data Grids include 411 unique datasets, which were downloaded almost four times as often as the average dataset on HDX. 

In addition, the report explores changes in data availability, reflected both in the Data Grids and in conversations with data partners throughout the year. This qualitative analysis has allowed us to capture changes that would otherwise be hard to quantify. We also explain the impact of AI bots on web traffic and Open Data platforms following the aggressive rollout of large language models. Finally, we take a closer look at the organizations that contribute climate hazards data on HDX, which is essential for getting ahead of crises…(More)”.

The State of Open Humanitarian Data 2026

Article by Lucila Pinto, Ehsan Masood, and Subhra Priyadarshini: “Uncertainty.” “Loss of trust.” “Definitely a crisis.” These are some of the ways in which researchers describe the state of affairs for government data in many countries.

“There is a new type of politics that is undermining the credibility of official statistics,” says João Pedro Azevedo, chief statistician for the United Nations children’s agency UNICEF in New York City.

Official statistics are data collected and validated by both national statistical agencies and international organizations. Nearly every country has an agency for official statistics. They collect information and organize it into statistics about myriad aspects of life, including what people earn, how many individuals are employed, how well children perform in school, the quality of nutrition, how long patients have to wait for an operation, levels of air pollution and increases to average temperatures.

National agencies collect data through surveys and from secondary sources. These data sets are used by governments to inform policy, by businesses to plan for the future, and by researchers and advocacy organizations. Official statistics, such as those measuring nations’ gross domestic product (GDP), are also the foundation for monitoring progress towards the 17 UN Sustainable Development Goals, the world’s plan to end poverty and achieve environmental sustainability.

“Official statistics are like the backbone of a nation’s data infrastructure,” says Steve Pierson, director of science policy at the American Statistical Association (ASA) in Washington DC. “Just like any other infrastructure — roads, bridges and highways — they cannot fail.”

People who work with or study official statistics say that they have never experienced a period similar to today’s situation. Those who call the current state a crisis think it has been triggered by an accumulation of overlapping factors. These include falling response rates to national surveys, cuts to funding and, in some cases, government interference.

Although funded by governments, national statistics offices are expected to operate independently of politicians, not least so that they are free to report the data as measured — much as academic research operates at arm’s length from its public-funding bodies. Moreover, rules established by an assembly of the world’s national statisticians and endorsed by the UN require that some data sets meet international standards, which state that official statistics should be accurate, impartial, trustworthy and grounded in evidence.

Although there is a history of inappropriate government involvement in the collection and reporting of national statistics (A. V. Georgiou Stat. J. IAOS 37, 85–105; 2021), there is a record of statistics agencies calling out the misuse of such data, too. But researchers worry that this might not be the case in future. “I fear that it is becoming harder for official statisticians to do their jobs,” says Diane Coyle, research director at the Bennett School of Public Policy at the University of Cambridge, UK.

Nature explores problems with official statistics in four countries that are causing concern for researchers and statisticians…(More)”.

National statistics are in crisis around the world — and the impacts will be severe
