
Stefaan Verhulst

Article by Matt Prewitt: “…Markets have always required some form of protectionist intervention — like intellectual property law — to help foster innovation. In recent years, startups have innovated because of a rich symbiosis with tech giants and their monopoly-seeking investors. Startups are indeed hungry, but their hunger is not to serve consumer needs or the national interest; it is to join the happy ranks of the monopolists. The nature of technological innovation is that competitive markets, without being “managed,” do not inspire it.

Today, this may sound bizarre, heterodox and jarring. But it was once fairly mainstream opinion. In the middle part of the 20th century, many of America’s most celebrated economic minds taught that competitive markets cause technological progress to stagnate. During the neoliberal era that followed, from the 1980s to the 2010s, this idea was largely forgotten and pushed to the margins of politics and academia. But it never lost its kernel of truth.

Where to from here? Both sides of the battered American center must first face their mistakes. This will be painful, not only because their errors have resulted in profound and long-term mis-governance, but also because their blind spots are deeply entangled with old, hard-to-kick ideological habits. The center-left’s sunny techno-utopianism traces its roots back to the rationalism of the French Revolution, via Karl Marx and early 20th-century progressives. The center-right’s fervent market fundamentalism is equally a relic of bygone eras, reflecting the thought of Friedrich Hayek and Milton Friedman — idea-warriors who pitched competitive markets as a cure-all largely to one-up the utopian promises of their techno-optimistic progressive foes. Thus, today, center-right and center-left thinking both feel like artifacts from yesterday’s ideological trenches. A new form of centrism that wants to speak to 2026 would need to thoroughly clear the decks…(More)”

The Progress Paradox

Article by Sarah Perez: “In a blog post, the Wikimedia Foundation, the organization that runs the popular online encyclopedia, called on AI developers to use its content “responsibly” by ensuring its contributions are properly attributed and that content is accessed through its paid product, the Wikimedia Enterprise platform.

The opt-in, paid product allows companies to use Wikipedia’s content at scale without “severely taxing Wikipedia’s servers,” the Wikimedia Foundation blog post explains. In addition, the product’s paid nature allows AI companies to support the organization’s nonprofit mission.

While the post doesn’t go so far as to threaten penalties or any sort of legal action for use of its material through scraping, Wikipedia recently noted that AI bots had been scraping its website while trying to appear human. After updating its bot-detection systems, the organization found that its unusually high traffic in May and June had come from AI bots that were trying to “evade detection.” Meanwhile, it said that “human page views” had declined 8% year-over-year.

Now Wikipedia is laying out its guidelines for AI developers and providers, saying that generative AI developers should provide attribution to give credit to the human contributors whose content it uses to create its outputs…(More)”.
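For developers, the attribution Wikipedia is asking for can be straightforward. Below is a hedged Python sketch that fetches a page summary through Wikipedia’s free public REST API and assembles an attribution string; the user-agent string and contact address are placeholders, and Wikimedia Enterprise, the paid product discussed above, uses its own authenticated API that is not shown here.

```python
# Hedged sketch: fetching a page via Wikipedia's public REST summary API and
# building the kind of attribution the Foundation asks AI developers to provide.
# The user agent and contact address below are placeholders.
import requests

def fetch_with_attribution(title: str) -> dict:
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    # Identify the client honestly instead of masquerading as a human browser.
    headers = {"User-Agent": "ExampleResearchBot/1.0 (contact@example.org)"}
    resp = requests.get(url, headers=headers, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return {
        "text": data.get("extract", ""),
        "attribution": f"Source: Wikipedia, \"{data['title']}\" "
                       f"({data['content_urls']['desktop']['page']}), CC BY-SA.",
    }

print(fetch_with_attribution("Alan_Turing")["attribution"])
```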

Wikipedia urges AI companies to use its paid API, and stop scraping

Paper by Wenlan Zhang et al: “Big data has emerged as a critical instrument for urban planning, but is limited by reliability and representativeness. Its availability varies across space, time, and sociodemographics, especially in the Global South. This results in the emergence of digitally invisible groups: individuals who are unable to contribute to and benefit from data-informed decisions, thus exacerbating existing inequalities and further marginalising populations already at risk of exclusion. This study presents an example application, i.e., land use classification in a developing country, combining traditional geospatial data (satellite imagery, night lights, building footprints) with big data (geotagged Twitter, street view imagery) and compares random forest classification results across different data solutions using multiple evaluation methods. Results show residents of informal settlements in Nairobi are underrepresented in geotagged Twitter data, and inaccessible neighbourhoods lack street view imagery. Relying on a single data source does not improve urban mapping; its biases can render some groups digitally invisible. Integrating multiple data sources is likely to mitigate these gaps by capturing different people and places, but this should be guided by a systematic, class-wise comparison rather than indiscriminate aggregation. We recommend primary data collection and participatory mapping to address blind spots and ensure these communities are represented in data-driven decision-making…(More)”.
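To make the comparison concrete, here is a minimal Python sketch of the class-wise evaluation the paper recommends, assuming a hypothetical per-cell feature table; the file name, column names, and feature sets are illustrative placeholders, not the authors’ actual pipeline.

```python
# Hypothetical sketch: comparing random forest land-use classifiers trained on
# single-source vs. combined feature sets, evaluated class by class. The input
# file and feature columns are placeholders, not the authors' pipeline.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("nairobi_grid_cells.csv")  # hypothetical per-cell feature table

feature_sets = {
    "satellite_only": ["ndvi", "night_lights", "building_density"],
    "social_only": ["tweet_count", "streetview_coverage"],
    "combined": ["ndvi", "night_lights", "building_density",
                 "tweet_count", "streetview_coverage"],
}

y = df["land_use_class"]
for name, cols in feature_sets.items():
    X_train, X_test, y_train, y_test = train_test_split(
        df[cols], y, test_size=0.3, stratify=y, random_state=42)
    clf = RandomForestClassifier(n_estimators=500, random_state=42)
    clf.fit(X_train, y_train)
    # Class-wise metrics reveal which groups a given data source leaves invisible.
    print(name)
    print(classification_report(y_test, clf.predict(X_test)))
```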

Revealing Digitally Invisible Groups through a Machine Learning Approach Using Multi-Source Data

OECD: “People in Latin America and the Caribbean are more optimistic than the OECD average about their governments’ ability to tackle complex global challenges, even as overall levels of trust in government remain lower, according to a new OECD report.

OECD Survey on Drivers of Trust in Public Institutions in Latin America and the Caribbean: 2025 Results is the first regional initiative conducted under the Global Trust Survey Project. Covering ten Latin American and Caribbean countries, the survey explores participants’ experiences with and expectations of their governments across key areas such as reliability, responsiveness, ability to manage long-term and global challenges, integrity, fairness, and openness.

Across the ten countries surveyed: on average, 35% of people express high or moderately high trust in the national government, while around half (48%) report low or no trust. Public trust varies significantly across countries and institutions. The armed forces, police, and media are more trusted than the judiciary, the civil service, legislatures and political parties…

Trust also varies across population groups, with trust in public institutions lower among those with financial, security and discrimination concerns, and among women and younger people. Perceptions of political voice and partisanship are more strongly associated with trust gaps than socio-economic and demographic characteristics.

People who feel they have a say and the government listens to them are three times more likely to trust their government than those who do not. Currently, only 25% of respondents feel they have a say in government decisions, and just 36% believe national governments are held accountable by legislatures…(More)”.

Greater use of evidence and public input in policymaking could strengthen trust in Latin American and Caribbean public institutions

Book by Slavko Splichal: “…explores the evolving nature of publicness in the era of digital communication and social media saturation, arguing that the rise of the “gig public” represents a new paradigm that challenges the traditional conceptualization of the public in shaping social and political change. The gig public departs from traditional notions of publicness and the public, characterized by individuals’ spontaneous and less-structured engagement in public discourse. This engagement is often hampered by challenges in fostering sustained interaction and depth of discussion, due to the ephemeral nature of online interactions.
In particular, this monograph highlights the importance of customs, negotiations, and contracts that complement the normatively privileged public reasoning in public domains. It examines the transformations in the multifaceted nature of the public and its interrelationship with other social structures amid the shifting boundaries between public and private domains. In addition, it explores the evolution of conceptualizations of publicness and related concepts within critical theory, illustrating how contemporary shifts are redefining civic engagement and the essence of public life in a rapidly changing world.

From these perspectives, the study is structured around three primary focal points: First, it analyzes how new information technologies and AI have altered human interactions within the public sphere. Second, it examines the impact of capitalist economic dynamics and governmentality strategies on reshaping the public realm, fundamentally altering the essence of the public and its democratic potential. Third, it explores how habitual and routine practices traditionally associated with the private sphere are now influencing the ongoing evolution of publicness…(More)”.

The Gig Public

Paper by Yael Borofsky et al: “Infrastructure inequities define modern cities. This Perspective reflects the viewpoint of a transdisciplinary group of co-authors working to advance infrastructural equity in low-income urban contexts. We argue that methodological silos and data fragmentation undermine the creation of a knowledge base to support coordinated action across diverse actors. As technological advances make it possible to ‘see’ informal settlements without engaging residents, our agenda advocates for (1) the integration of diverse methodological and epistemological traditions; (2) a focus on research that informs context-specific action; and (3) a commitment to ethical standards that center affected communities in efforts to improve infrastructure access…(More)”.

An agenda for data-rich, action-oriented, ethical research on infrastructure in informal settlements

Report by Katherine Barrett and Richard Greene: “State governments are increasingly exploring how GenAI can streamline operations, enhance service delivery, and support policy innovation—while safeguarding the human judgment, transparency, and accountability that define public governance.

Through an in-depth review of current pilot projects, emerging use cases, and early implementation lessons, the authors offer a forward-looking perspective on how GenAI can serve as a collaborative partner for state employees. The report maps areas where AI can complement, augment, or automate tasks within diverse state functions, from public health and transportation to education and environmental management.

Key recommendations include fostering cross-agency learning networks, investing in targeted workforce training and upskilling, and adopting governance frameworks that balance innovation with responsible use. By following these strategies, states can cultivate a technologically empowered and resilient workforce in an era of rapid digital change…(More)”.

AI in State Government

Article by Jenny Gross: “…About 1,500 letters are sent once a year to randomly selected residents in Ostbelgien. Of those who indicate interest, about 30 are chosen to join the citizens’ assembly.

Starting in September, they meet on Saturdays for several hours over a period of two months, or longer if needed, and are assigned a topic. Each participant is paid a stipend of about 115 euros ($133) per day. They gather in the regional parliament building, which served as a military hospital during World War II, with a moderator employed by the government facilitating the discussions.

Though the assemblies’ recommendations are not binding, lawmakers are required to consider them, and many have been adopted. Among the changes they have spearheaded: easing eligibility requirements for low-income housing; including residents’ family members on the boards of assisted-living facilities; and new funding to encourage young people to take up professions such as nursing, which is facing a shortage in the region.

The Belgian experiment recalls ancient Athenian democracy, in the 5th century B.C., when groups of free men were chosen at random to serve as government officials each year. There wasn’t much diversity in that citizenry, however, and these days, leaders in Eupen, the capital of Ostbelgien, acknowledge that what works in their small, relatively homogenous region may not translate everywhere.

The assemblies’ purview is also limited, naturally, to areas where the regional government has control, such as education and housing, rather than more divisive topics like the entry of immigrants, which is overseen by the federal government in Brussels…(More)”.

Democracy Is in Trouble. This Region Is Turning to Its People.

Paper by Megan A Brown et al: “Scientists across disciplines often use data from the internet to conduct research, generating valuable insights about human behavior. However, as generative artificial intelligence relying on massive text corpora becomes increasingly valuable, platforms have greatly restricted access to data through official channels. As a result, researchers will likely engage in more web scraping to collect data, introducing new challenges and concerns for researchers. This paper proposes a comprehensive framework for web scraping in social science research for U.S.-based researchers, examining the legal, ethical, institutional, and scientific factors that we recommend researchers consider when scraping the web. We present an overview of the current regulatory environment impacting when and how researchers can access, collect, store, and share data via scraping. We then provide researchers with recommendations to conduct scraping in a scientifically legitimate and ethical manner. We aim to equip researchers with the relevant information to mitigate risks and maximize the impact of their research amid this evolving data access landscape…(More)”.
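As a concrete illustration of the practices the framework points toward, here is a minimal Python sketch of a “polite” scraper that honors robots.txt, identifies itself, and rate-limits its requests; the user agent, contact address, and target URL are placeholders, and this is a sketch of common good practice rather than the authors’ protocol.

```python
# Minimal sketch of respectful scraping practices: check robots.txt, identify
# the research client, and rate-limit requests. All names are placeholders.
import time
import requests
from urllib.robotparser import RobotFileParser
from urllib.parse import urlparse

USER_AGENT = "AcademicResearchBot/0.1 (researcher@university.edu)"

def polite_get(url: str, delay: float = 2.0) -> str | None:
    parsed = urlparse(url)
    robots = RobotFileParser(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    robots.read()
    if not robots.can_fetch(USER_AGENT, url):
        return None  # the site disallows this path; respect it
    time.sleep(delay)  # a fixed delay keeps load on the host predictable
    resp = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=10)
    resp.raise_for_status()
    return resp.text

html = polite_get("https://example.org/public-page")
```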

Web scraping for research: Legal, ethical, institutional, and scientific considerations

Paper by Erick Elejalde et al: “This study examines behavioral responses after mobile phone evacuation alerts during the February 2024 wildfires in Valparaíso, Chile. Using anonymized mobile network data from 580,000 devices, we analyze population movement following emergency SMS notifications. Results reveal three key patterns: (1) initial alerts trigger immediate evacuation responses with connectivity dropping by 80% within 1.5 h, while subsequent messages show diminishing effects; (2) substantial evacuation also occurs in non-warned areas, indicating potential transportation congestion; (3) socioeconomic disparities exist in evacuation timing, with high-income areas evacuating faster and showing less differentiation between warned and non-warned locations. Statistical modeling demonstrates socioeconomic variations in both evacuation decision rates and recovery patterns. These findings inform emergency communication strategies for climate-driven disasters, highlighting the need for targeted alerts, socioeconomically calibrated messaging, and staged evacuation procedures to enhance public safety during crises…(More)”.
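A hypothetical sketch of the core measurement, in Python with pandas: comparing the number of unique connected devices in each area during the 1.5 hours before and after an alert. The input file, column names, and alert timestamp are placeholders; the study’s actual pipeline is not described in this excerpt.

```python
# Hypothetical sketch: measuring connectivity change around an SMS alert from
# anonymized tower pings. File and column names are placeholders.
import pandas as pd

pings = pd.read_csv("tower_pings.csv", parse_dates=["timestamp"])
alert_time = pd.Timestamp("2024-02-02 18:00:00")  # placeholder alert time

window = pd.Timedelta(hours=1.5)
before = pings[(pings.timestamp >= alert_time - window) & (pings.timestamp < alert_time)]
after = pings[(pings.timestamp >= alert_time) & (pings.timestamp < alert_time + window)]

by_area = pd.DataFrame({
    "before": before.groupby("area_id")["device_id"].nunique(),
    "after": after.groupby("area_id")["device_id"].nunique(),
})
# A sharp drop in unique connected devices is read as evacuation of the area.
by_area["pct_change"] = 100 * (by_area["after"] - by_area["before"]) / by_area["before"]
print(by_area.sort_values("pct_change").head())
```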

Use of mobile phone data to measure behavioral response to SMS evacuation alerts
