
Stefaan Verhulst

OECD: “People in Latin America and the Caribbean are more optimistic than the OECD average about their governments’ ability to tackle complex global challenges, even as overall levels of trust in government remain lower, according to a new OECD report.

OECD Survey on Drivers of Trust in Public Institutions in Latin America and the Caribbean: 2025 Results is the first regional initiative conducted under the Global Trust Survey Project. Covering ten Latin American and Caribbean countries*, the survey explores participants’ experiences with and expectations of their governments across key areas such as reliability, responsiveness, ability to manage long-term and global challenges, integrity, fairness, and openness.

Across the ten countries surveyed, an average of 35% of people express high or moderately high trust in the national government, while around half (48%) report low or no trust. Public trust varies significantly across countries and institutions: the armed forces, police, and media are more trusted than the judiciary, the civil service, legislatures, and political parties…

Trust also varies across population groups: it is lower among those with financial, security, and discrimination concerns, and among women and younger people. Perceptions of political voice and partisanship are more strongly associated with trust gaps than socio-economic and demographic characteristics.

People who feel they have a say and the government listens to them are three times more likely to trust their government than those who do not. Currently, only 25% of respondents feel they have a say in government decisions, and just 36% believe national governments are held accountable by legislatures…(More)”.

Greater use of evidence and public input in policymaking could strengthen trust in Latin American and Caribbean public institutions

Book by Slavko Splichal: “…explores the evolving nature of publicness in the era of digital communication and social media saturation, arguing that the rise of the “gig public” represents a new paradigm that challenges the traditional conceptualization of the public in shaping social and political change. The gig public departs from traditional notions of publicness and the public, characterized by individuals’ spontaneous and less-structured engagement in public discourse. This engagement is often hampered by challenges in fostering sustained interaction and depth of discussion, due to the ephemeral nature of online interactions.
In particular, this monograph highlights the importance of customs, negotiations, and contracts that complement the normatively privileged public reasoning in public domains. It examines the transformations in the multifaceted nature of the public and its interrelationship with other social structures amid the shifting boundaries between public and private domains. In addition, it explores the evolution of conceptualizations of publicness and related concepts within critical theory, illustrating how contemporary shifts are redefining civic engagement and the essence of public life in a rapidly changing world. From these perspectives, the study is structured around three primary focal points: First, it analyzes how new information technologies and AI have altered human interactions within the public sphere. Second, it examines the impact of capitalist economic dynamics and governmentality strategies on reshaping the public realm, fundamentally altering the essence of the public and its democratic potential. Third, it explores how habitual and routine practices traditionally associated with the private sphere are now influencing the ongoing evolution of publicness…(More)”.

The Gig Public

Paper by Yael Borofsky et al: “Infrastructure inequities define modern cities. This Perspective reflects the viewpoint of a transdisciplinary group of co-authors working to advance infrastructural equity in low-income urban contexts. We argue that methodological silos and data fragmentation undermine the creation of a knowledge base to support coordinated action across diverse actors. As technological advances make it possible to ‘see’ informal settlements without engaging residents, our agenda advocates for (1) the integration of diverse methodological and epistemological traditions; (2) a focus on research that informs context-specific action; and (3) a commitment to ethical standards that center affected communities in efforts to improve infrastructure access…(More)”.

An agenda for data-rich, action-oriented, ethical research on infrastructure in informal settlements

Report by Katherine Barrett and Richard Greene: “State governments are increasingly exploring how GenAI can streamline operations, enhance service delivery, and support policy innovation, while safeguarding the human judgment, transparency, and accountability that define public governance.

Through an in-depth review of current pilot projects, emerging use cases, and early implementation lessons, the authors offer a forward-looking perspective on how GenAI can serve as a collaborative partner for state employees. The report maps areas where AI can complement, augment, or automate tasks within diverse state functions, from public health and transportation to education and environmental management.

Key recommendations include fostering cross-agency learning networks, investing in targeted workforce training and upskilling, and adopting governance frameworks that balance innovation with responsible use. By following these strategies, states can cultivate a technologically empowered and resilient workforce in an era of rapid digital change…(More)”.

AI in State Government

Article by Jenny Gross: “…About 1,500 letters are sent once a year to randomly selected residents in Ostbelgien. Of those who indicate interest, about 30 are chosen to join the citizens’ assembly.

Starting in September, they meet on Saturdays for several hours over a period of two months, or longer if needed, and are assigned a topic. Each participant is paid a stipend of about 115 euros ($133) per day. They gather in the regional parliament building, which served as a military hospital during World War II, with a moderator employed by the government facilitating the discussions.

Though the assemblies’ recommendations are not binding, lawmakers are required to consider them, and many have been adopted. Among the changes they have spearheaded: easing eligibility requirements for low-income housing; including residents’ family members on the boards of assisted-living facilities; and new funding to encourage young people to take up professions such as nursing, which faces a shortage in the region.

The Belgian experiment recalls ancient Athenian democracy in the 5th century B.C., when groups of free men were chosen at random to serve as government officials each year. There wasn’t much diversity in that citizenry, however, and these days, leaders in Eupen, the capital of Ostbelgien, acknowledge that what works in their small, relatively homogeneous region may not translate everywhere.

The assemblies’ purview is also limited, naturally, to areas where the regional government has control, such as education and housing, rather than more divisive topics like immigration, which is overseen by the federal government in Brussels…(More)”.

Democracy Is in Trouble. This Region Is Turning to Its People.

Paper by Megan A Brown et al: “Scientists across disciplines often use data from the internet to conduct research, generating valuable insights about human behavior. However, as generative artificial intelligence relying on massive text corpora becomes increasingly valuable, platforms have greatly restricted access to data through official channels. As a result, researchers will likely engage in more web scraping to collect data, introducing new challenges and concerns for researchers. This paper proposes a comprehensive framework for web scraping in social science research for U.S.-based researchers, examining the legal, ethical, institutional, and scientific factors that we recommend researchers consider when scraping the web. We present an overview of the current regulatory environment impacting when and how researchers can access, collect, store, and share data via scraping. We then provide researchers with recommendations to conduct scraping in a scientifically legitimate and ethical manner. We aim to equip researchers with the relevant information to mitigate risks and maximize the impact of their research amid this evolving data access landscape…(More)”.
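
Two of the practices the paper's recommendations gesture at, honoring a site's robots.txt and rate-limiting requests, can be illustrated with a minimal sketch using only the Python standard library. This is not code from the paper; the names (`is_allowed`, `RateLimiter`, `MIN_DELAY`) and the sample robots.txt are illustrative assumptions.

```python
# Hedged sketch of "polite scraping": check robots.txt permissions and
# enforce a minimum delay between requests. Illustrative only; names and
# the sample policy below are invented, not from the paper.
import time
import urllib.robotparser

MIN_DELAY = 1.0  # seconds between requests to one host (assumed default)

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if the given robots.txt text permits user_agent to fetch url."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())  # parse from text, no network call
    return rp.can_fetch(user_agent, url)

class RateLimiter:
    """Enforce a minimum delay between successive requests to a host."""
    def __init__(self, min_delay: float = MIN_DELAY):
        self.min_delay = min_delay
        self._last = 0.0
    def wait(self) -> None:
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_delay:
            time.sleep(self.min_delay - elapsed)
        self._last = time.monotonic()

robots = "User-agent: *\nDisallow: /private/\n"
print(is_allowed(robots, "research-bot", "https://example.org/public/page"))   # True
print(is_allowed(robots, "research-bot", "https://example.org/private/data"))  # False
```

Respecting machine-readable access policies and pacing requests are baseline courtesies; the paper's legal, ethical, and institutional considerations (terms of service, IRB review, data storage and sharing) go well beyond what code alone can address.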

Web scraping for research: Legal, ethical, institutional, and scientific considerations

Paper by Erick Elejalde et al: “This study examines behavioral responses after mobile phone evacuation alerts during the February 2024 wildfires in Valparaíso, Chile. Using anonymized mobile network data from 580,000 devices, we analyze population movement following emergency SMS notifications. Results reveal three key patterns: (1) initial alerts trigger immediate evacuation responses with connectivity dropping by 80% within 1.5 h, while subsequent messages show diminishing effects; (2) substantial evacuation also occurs in non-warned areas, indicating potential transportation congestion; (3) socioeconomic disparities exist in evacuation timing, with high-income areas evacuating faster and showing less differentiation between warned and non-warned locations. Statistical modeling demonstrates socioeconomic variations in both evacuation decision rates and recovery patterns. These findings inform emergency communication strategies for climate-driven disasters, highlighting the need for targeted alerts, socioeconomically calibrated messaging, and staged evacuation procedures to enhance public safety during crises…(More)”.
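
The headline metric, an 80% connectivity drop within 1.5 hours of the first alert, can be sketched as a simple baseline-versus-post-alert comparison on aggregated device counts. This is not the authors' code; the function name, window logic, and the toy counts below are invented for illustration.

```python
# Hedged sketch (not the authors' method): estimate the percentage drop in
# connectivity after an alert from a series of per-interval device counts.
def connectivity_drop(counts, alert_idx, window):
    """Percentage drop between the pre-alert mean device count and the
    minimum count observed within `window` intervals after the alert."""
    baseline = sum(counts[:alert_idx]) / alert_idx   # mean before the alert
    post_min = min(counts[alert_idx:alert_idx + window])
    return 100.0 * (1 - post_min / baseline)

# Toy hourly device counts for a warned cell; the alert arrives at index 4.
counts = [1000, 1020, 980, 1000, 600, 250, 190, 300]
print(round(connectivity_drop(counts, alert_idx=4, window=3), 1))  # → 81.0
```

A drop in observed connectivity serves here as a proxy for evacuation, since devices disconnect from local cell towers as residents leave the warned area.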

Use of mobile phone data to measure behavioral response to SMS evacuation alerts

Article by Roeland Beerten, Johannes Jütting and Stefaan G. Verhulst in Le Monde: “Official statistics, the foundation of evidence-based governance, are today caught in a crossfire between politics and public distrust. In some countries, statistical agencies are being marginalized; in others, citizens doubt their usefulness. If statistical systems fail to reflect lived realities, if they supply only abstract averages, they risk becoming one more casualty of the crisis of democratic trust.

“Inflation is under control.” “GDP is growing.” “Our cities are safe.” Repeated in press release after press release, these claims now inspire more distrust than assent. For many citizens, bills are soaring, savings are shrinking, and the prospects of future generations are darkening. When the numbers contradict lived experience, trust in experts collapses.

For decades, statistical offices have described the world in aggregates: GDP, unemployment, inflation. These indicators are useful, but their logic smooths over disparities: the macro takes precedence over the micro, the average erases the margin. Saying that “the economy is growing” means little to someone watching their purchasing power erode. Hence a rising exasperation that could be summed up as: “You can’t eat GDP.”..(More)”.

“When the numbers contradict lived experience, trust in experts collapses”

Blog by Anna Colom, Marta Poblet, and Stefaan Verhulst: “Independent and public interest media have long been considered a key public good for societies, a pillar of democracy and accountable governance. Well-funded public media systems have been found to correlate consistently with ‘healthy democracies’. However, the struggle to sustain public interest media as a resilient public good is long-standing. Decades of market fragmentation, the digital transformations turning the public sphere into multimodal networked spheres, and, more recently, the mainstreaming of generative AI (GenAI) systems, and General Purpose AI more broadly, have kept public interest and independent media in a constant state of survival and reinvention.

Current trends in AI development, particularly large GenAI models, risk creating a vicious cycle for journalism and public interest media. These models are trained on high-quality journalistic content, but the methods for capturing the data have been associated with exploitative and extractive web-scraping practices that routinely disregard property rights, licensing, quality, (mis)representation, and bias, as GenAI models often use these data without providing provenance, context, or integrity. Once deployed, the models often draw audiences away from the very outlets that produced this content by providing AI-generated summaries and news-like outputs directly to users. As a result, media organisations lose readership, along with the advertising and subscription revenue that sustains independent reporting. With fewer resources, their ability to produce quality journalism declines, which in turn reduces the availability of trustworthy content. This cycle threatens not only the financial viability of public-interest media but also the integrity of the AI systems that depend on their work and, ultimately, the integrity of the knowledge available to the public…(More)”.

Strengthening Public Interest Media in the Age of GenAI

Article by Anjana Ahuja: “…The Genuinely Hard Problems scheme, designed to expose bright young minds each week to the world’s biggest unanswered questions, might usefully chart a course for other institutions to follow. According to Logan McCarty, a Harvard science lecturer and dean of education who is organising the classes with the scheme’s creator, neurobiology professor Jeff Lichtman, the internet and AI have lessened the need for ambitious thinkers to acquire specialised technical skills and internalise vast quantities of information…

Specialist knowledge can now be digitally retrieved in seconds; AI can mine data, construct hypotheses and design experiments. On top of that, a slender scholarly lens can obscure a wider perspective. Today, some of the biggest problems facing humanity, such as climate change and energy scarcity, tend to sprawl across disciplines rather than sit snugly within academic departments.

The primary task of scientists, the Harvard educators believe, is asking the right questions, because AI can answer even difficult queries if they are well-posed; being fearless and willing to fail, with no area of science off-limits; and doing research that is meaningful and has impact, rather than chasing quick wins…(More)”.

For scientists, the right questions are often the hardest

Get the latest news right in your inbox

Subscribe to curated findings and actionable knowledge from The Living Library, delivered to your inbox every Friday