
Stefaan Verhulst

Article by Mahvish Shaukat et al: “Many governments and policymakers rely on policy-advising organisations – international development banks, think tanks, ministries – to translate academic research into actionable recommendations. Yet better evidence does not automatically produce better policy. Even when high-quality research exists, it must travel through layers of hierarchy inside a policy-advising organisation, both upward and downward. A junior analyst may surface a finding that never reaches the decision-maker who could act on it. Equally, a senior leader’s review of the evidence may never filter down to the operational level. Each step in the chain is a potential bottleneck, and these frictions in the evidence-to-policy pipeline can impede the use of rigorous research in practice (DellaVigna et al. 2024, Garcia-Hombrados et al. 2025, Bonargent 2024, Rao 2024).

A growing evidence base examines how policymakers engage with evidence (Vivalt and Coville 2023, Toma and Bell 2024), and how training can build capacity for evidence use (Crowley et al. 2021, Mehmood et al. 2024), but much less is known about what drives evidence diffusion within organisations. Who shares evidence with whom? Does it depend on where in the hierarchy evidence first lands? Do concerns about how peers might react shape whether sharing happens? These are the questions we set out to answer…(More)”.

How does evidence diffuse through organisations?

Report by Reema Patel: “Our collective thinking about data governance is shaped by unconscious beliefs about the world. These are sometimes described as mental models. Our mental models shape our sense of which problems are noticed and which solutions to those problems are feasible and possible. They can sometimes limit our understanding of important issues such as how data can be governed and managed. Our current mental models about data are failing: the ongoing data trust deficit, public concern about data governance approaches, poor data quality, datasets with systemic bias and inequality that shape artificial intelligence, and repeated data governance system failures all point to the need to dramatically reshape the way we think about data governance…


This report maps out ten different mental models of data governance. These are: data colonialism, data ownership, data control, data technocracy, data liberation, data protection, data justice, data sovereignty, data culture, and data stewardship. Understanding the mental models through which we think about data governance, I argue, is an essential first step towards moving beyond current approaches and realising a just and viable data governance future…(More)”.

Living well with data: stewardship as a just and viable paradigm

Blog by Timber Stinson-Schroff: “…The mission of this institute is to advance the theory and practice of protocol design, analysis, and stewardship across domains, as well as promote protocol literacy, appreciation and cultural salience globally. In other words, our mission is to build the field and community capable of stewarding the ongoing planetary processes of protocolization – the slow, largely invisible means by which human behaviors become standardized into the coordinating infrastructure of civilization.

The Protocol Institute inherits the work of its predecessor, the Summer of Protocols (SoP) program, which ran from 2023 to 2025. The Ethereum Foundation initiated SoP with a bold thesis: deepened understanding of protocols generally would enable better governance of the core Ethereum protocol specifically. As a seasonal grants program, SoP was designed to:

  • Bootstrap a new field of study around protocols
  • Establish protocols as a first-class concept for thinking about and acting in the world
  • Seed a scene and improve literacy around protocols

The program not only succeeded in these objectives, it went beyond them, sparking a rich discourse spanning many domains, such as robotics, climate, government, natural resources, insurance, programmable cryptography, economics, urban planning, health, gaming, encryption, wildfire management and more. Through its successes, both planned and unplanned, SoP has created the need for a suitable vehicle to sustain long-term activities building on what has already been accomplished…(More)”.

The Protocol Institute

Paper by Valentine Goddard and Dr. Leslie Salgado Arzuaga: “The rapid expansion of generative artificial intelligence is profoundly reshaping how cultural and knowledge resources are created, shared, and governed, exposing significant gaps in existing frameworks for understanding, protection, and oversight. While intellectual property regimes (IPRs) remain one of the primary mechanisms available to artists, creators, cultural workers, and Indigenous knowledge holders to protect their work, safeguard cultural heritage, and derive fair value from their contributions, they are increasingly strained by the scale, speed and opacity of AI systems, which often rely on vast amounts of data drawn from public, proprietary and traditional knowledge sources.

At the same time, these same frameworks can enable the privatization and appropriation of public domain knowledge and cultural commons with proprietary AI systems, creating tensions between artists’ economic rights, cultural sovereignty, and broader economic development rights, particularly for communities in the Global Majority. These impacts are gendered and intersectional, disproportionately affecting women and communities whose knowledge, labour and decision-making authority have been historically undervalued or excluded, contributing to labour precarity, cultural erasure, and unequal access to decision-making. Furthermore, there is a clear lack of accessible, independent, and balanced information to support civil society, cultural actors, and policymakers in navigating these complex dynamics. In this context, the creation of a dedicated, civil society-led and collaboratively designed Repository emerged as a necessary response to facilitate knowledge sharing, surface diverse perspectives, share best practices, protect digital cultural sovereignty, and support more equitable, informed, rights-based, and culturally sensitive approaches to AI governance in the cultural sphere…(More)”.

Democratic Infrastructure for Creative Futures: Building the AI, IP & Culture Repository

Report by Luca Picci and Jorge Rivera: “The development community is driving down a winding road while looking in the rearview mirror. Official statistics tell us where official development assistance (ODA) stood a year ago, with precision and authority. But they don’t tell us the road has turned until after the fact.

That’s a problem. It means that programme decisions, policy responses, and advocacy strategies are being made on the basis of incomplete or outdated data. In 2025, the four largest Development Assistance Committee (DAC) donors—the United States, Germany, the United Kingdom, and France—all cut ODA. Initial projections estimated that ODA would fall by between 9% and 17% in 2025 [1]; preliminary ODA figures published in April 2026 suggest that ODA fell by 23.1% [2]. Official detailed data for the 2025 cuts, disaggregated by donor, sector, and recipient, will not be available until December 2026, however, and detailed data on 2026 ODA will not be available until late 2027.

Nowcasting methods have been developed to address this problem. They estimate the current value of a lagged official statistic using data published at a higher frequency, updating estimates as new information arrives, and quantifying uncertainty. Nowcasting provides a blurred view through the windshield; still not a clear picture of the road, but enough to spot the turn.
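The "update estimates as new information arrives, and quantify uncertainty" logic the authors describe can be sketched with a simple precision-weighted (Gaussian) update. This is an illustrative toy, not the paper's actual method: the starting projection, the hypothetical monthly signals, and their noise levels are all assumed numbers chosen only to show how an estimate tightens as higher-frequency data comes in.

```python
import math

# Toy nowcast: combine a prior projection of the annual ODA change with
# higher-frequency signals via inverse-variance (precision) weighting.
# Each update pulls the estimate toward the new signal and shrinks the
# standard deviation, making the remaining uncertainty explicit.

def update(prior_mean, prior_sd, signal_mean, signal_sd):
    """Gaussian update: precision-weighted mean and combined sd."""
    w_prior = 1.0 / prior_sd ** 2
    w_signal = 1.0 / signal_sd ** 2
    mean = (w_prior * prior_mean + w_signal * signal_mean) / (w_prior + w_signal)
    sd = math.sqrt(1.0 / (w_prior + w_signal))
    return mean, sd

# Start from a wide projection that ODA will fall 9-17% (midpoint -13%).
mean, sd = -13.0, 4.0
# Hypothetical monthly signals (announced cuts, in % change) with noise:
for signal, noise in [(-18.0, 5.0), (-22.0, 4.0), (-24.0, 3.0)]:
    mean, sd = update(mean, sd, signal, noise)
    print(f"nowcast: {mean:+.1f}% (sd {sd:.1f})")
```

With these made-up inputs the estimate drifts from the initial -13% toward the signals while its standard deviation falls with every update, which is the blurred-but-improving "view through the windshield" the authors describe.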

This paper reviews the ODA data landscape and existing nowcasting approaches, and assesses which merit further investigation in the context of ODA. Our conclusion is that a nowcasting system for ODA is within methodological reach. Such a system could produce estimates that get updated as new information arrives, ahead of official data publication. Initially, this system would estimate ODA at the donor level for major DAC donors, with finer disaggregation contingent on what the available data can support…(More)”.

Nowcasting for Official Development Assistance

Discussion Paper by the World Health Organization: “Artificial intelligence (AI) is increasingly shaping evidence-informed policy-making (EIP) in health by enabling faster analysis, synthesis and use of large and diverse data sources across the policy cycle. This discussion paper examines the intersection of AI and EIP, outlining how AI can support problem identification, policy design and implementation through enhanced data integration, predictive modelling, scenario simulation and adaptive feedback. It emphasizes that AI augments rather than replaces human judgement, while highlighting its potential to expand the evidence base and support more timely, responsive and iterative decision-making in complex health contexts. The paper analyses key risks and challenges associated with AI integration, including bias, opacity, equity concerns, data governance and regulatory gaps, and considers how these may affect different stages of the policy cycle. It reviews relevant governance frameworks and identifies areas of alignment between EIP and AI governance traditions. It further proposes practical considerations for responsible implementation, including human oversight, multidisciplinary collaboration, living evidence approaches and risk-based regulation. Intended for policy-makers, regulators and health stakeholders, the document provides a structured basis for leveraging AI to strengthen policy processes, improve health outcomes and maintain public trust…(More)”.

Artificial intelligence and evidence-informed policy: emerging challenges and opportunities

Book by Helen Pearson: “Today, more and more people around the globe are using scientific evidence to figure out what works—in health, government and business as well as conservation, schools and parenting. This wasn’t always the case. This book tells the story of the evidence revolution—a worldwide movement that promotes evidence-based thinking—and shows how it can help us all, especially in an age of alternative facts.

For many years, most medical advice was based on doctors’ opinions and conventional wisdom, not solid science. Helen Pearson describes how evidence-based medicine swept the world in the 1990s—becoming the predominant form of medicine practiced today—and how the idea that evidence should guide decisions is quietly transforming a host of other fields as well. Do police patrols reduce crime? Do performance appraisals boost job performance? Do welfare programs help the poor? Do smaller classes aid learning? Do smartphones harm teenagers? At a time when science is under attack and questionable claims run rampant, Pearson underscores the importance of evidence in all facets of our lives, empowering each of us to sift fact from falsehood and misinformation from the truth…(More)”.

Beyond Belief: How Evidence Shows What Really Works

Article by the Department for Enterprise (Isle of Man): “The legislation establishes the statutory framework for Data Asset Foundations, enabling data to be recognised, governed and managed as an asset within a clear legal structure.

Built on the Island’s existing Foundations Act 2011, it creates a new capability for businesses and marks a defining milestone in the Island’s wider digital and economic development. It establishes a world-first statutory framework to recognise data as an asset, placing the Isle of Man at the forefront of how data is treated within the global economy.

The initiative has been developed by Digital Isle of Man as part of the Island’s long-term Economic Strategy to leverage the Island’s strengths in regulation and security and offer a unique proposition for data businesses.

As data becomes increasingly central to global economic activity, organisations are looking for trusted and practical ways to govern and use it. By putting the legal foundations in place now, the Isle of Man is creating the conditions for new forms of economic activity, investment and high-value jobs across technology, professional services and data-led sectors.

In practical terms, this opens up new commercial opportunities, from data valuation and licensing to new fiduciary and assurance services, supporting both existing businesses and new entrants to the Island.

Businesses will be able to use the Data Asset Foundations framework to securely share data with partners without losing control of it, or to demonstrate its value as part of raising investment, creating new pathways for growth while maintaining strong governance.

For the Isle of Man, this is about creating new sources of economic growth, supporting high-value jobs and ensuring the Island remains competitive as global markets continue to evolve, while reinforcing its reputation as a trusted and well-regulated jurisdiction… More information about Data Asset Foundations is available at: www.digitalisleofman.com/data-asset-foundations…(More)”.

Isle of Man passes world-first legislation to establish data as an asset

About: “An open-source toolkit for comparing and conflating Points of Interest (POIs) across major geospatial datasets.

OpenPOIs downloads current US-wide POI snapshots from multiple publicly available sources — currently OpenStreetMap and Overture Maps — and conflates them into a single unified dataset. The web map lets you explore each source side by side. Each POI in the conflated dataset is given a confidence score, which is the probability that the POI currently exists based on available data from both sources…(More)”.
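The matching-and-scoring idea described above can be sketched in a few lines. This is a hypothetical illustration, not OpenPOIs' actual algorithm: the name-similarity and distance thresholds, and the confidence values assigned when one or both sources report a place, are invented for the example.

```python
import math
from difflib import SequenceMatcher

# Illustrative POI conflation sketch (not the actual OpenPOIs logic):
# treat two records as the same place if they are close together and
# similarly named, then score confidence higher when both sources agree.

def distance_m(a, b):
    """Approximate ground distance in metres between (lat, lon) points."""
    dlat = (a[0] - b[0]) * 111_320
    dlon = (a[1] - b[1]) * 111_320 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def is_match(poi_a, poi_b, max_m=50, min_sim=0.8):
    """Same POI if within max_m metres and names are min_sim similar."""
    sim = SequenceMatcher(None, poi_a["name"].lower(),
                          poi_b["name"].lower()).ratio()
    return distance_m(poi_a["loc"], poi_b["loc"]) <= max_m and sim >= min_sim

def confidence(in_osm, in_overture):
    """Toy probability the POI currently exists, by source agreement."""
    if in_osm and in_overture:
        return 0.95  # corroborated by both sources
    return 0.60 if (in_osm or in_overture) else 0.0

osm = {"name": "Joe's Coffee", "loc": (40.7128, -74.0060)}
overture = {"name": "Joes Coffee", "loc": (40.7129, -74.0061)}
matched = is_match(osm, overture)
print(matched, confidence(matched, matched))
```

A record that appears, with a near-identical name and location, in both OpenStreetMap and Overture Maps gets a high existence probability; a record seen in only one source gets a lower one.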

OpenPOIs

Article by Tiago C. Peixoto: “…For over a decade, those focused on the demand side of open data paid, and rightly so, a lot of attention to who would use the data, and how. AI solves the demand-side problem. But the moment you build the agent and point it at real government data, you discover a supply-side problem that was always there but never fully exposed. The techno-mediator bottleneck was masking it. When only a handful of skilled developers and data journalists could query government APIs, the partial nature of the data caused limited damage. The few who did query it had enough domain expertise to cross-reference. AI removes that containment. If millions of citizens can now query budget data through AI agents, and the data systematically undercounts by a factor of five, the result is not accountability at scale. It is misinformation at scale, laundered through the authority of clean data and confident AI responses.

To be clear: the open data movement never assumed the data was already “out there.” The whole point was to advocate for its release. The problem came after. When governments did start publishing, the shortage of people who could query and assess the data meant that its quality went, in many cases, largely unexamined. The mediation failure that reduced the usefulness of open data for accountability purposes also made it less useful for quality control. If almost nobody can check whether a budget API returns 20% or 100% of the real figures, governments face no cost for publishing incomplete extracts. The very conditions that weakened the demand side gave the supply side room to underdeliver, and to receive credit for it. Rather than a communication trick, openwashing was an architectural possibility created by the absence of capable users. And it was sustained by an institutional environment in which there was no requirement that a public-facing API reconcile with the government’s full internal financial records, no audit of coverage, and no penalty for publishing a clean but partial extract…(More)”.

Openwashing by Architecture: How AI Reveals Budget Opacity
