
Stefaan Verhulst

Article by Davidson Heath: “A century ago, two oddly domestic puzzles helped set the rules for what modern science treats as ‘real’: a Guinness brewer charged with quality control and a British lady insisting she could taste whether milk or tea was poured first.

Those stories sound quaint, but the machinery they inspired now decides which findings get published, promoted, and believed—and which get waved away as “not significant.” Instead of recognizing the limitations of statistical significance, fields including economics and medicine ossified around it, with dire consequences for science. In the 21st century, an obsession with statistical significance led to overprescription of both antidepressant drugs and a headache remedy with lethal side effects. There was another path we could have taken.

Sir Ronald Fisher succeeded 100 years ago in making statistical significance central to scientific investigation. Some scientists have argued for decades that blindly following his approach has led the scientific method down the wrong path. Today, statistical significance has brought many branches of science to a crisis of false-positive findings and bias.

At the beginning of the 20th century, the young science of statistics was blooming. One of the key innovations at this time was small-sample statistics—a toolkit for working with data that contain only a small number of observations. That method was championed by the great data scientist William S. Gosset. His ideas were largely ignored in favor of Fisher’s, and our ability to reach accurate and useful conclusions from data was harmed. It’s time to revive Gosset’s approach to experimentation and estimation…(More)”.

Our Obsession With Statistical Significance Is Ruining Science

New Public: “Last June, our Co-Director Eli Pariser laid out a bold new direction for New_ Public in this newsletter: “You may not think of this as social media, but most towns in America have some sort of general-purpose, locals-only digital forum: a Facebook group, a Google group, a Nextdoor neighborhood. These groups mostly run below the radar — they seem quotidian, maybe even boring. But according to Pew, “about half of US adults say they get their local news from online groups or forums,” more even than from newspapers. If they were strengthened into resilient, flourishing spaces, they could be crucial to reinforcing American democracy. That’s why we’re going to make them a major focus of our work at New_ Public.”

Since then, we’ve deeply explored local digital spaces, including Front Porch Forum’s inspiring example of what’s possible, and grim reminders of how the status quo is not sufficient with platforms like Nextdoor….

The main thing to know, maybe the most important thing, is that this is not just another social media app. Roundabout is a community space, built from the ground up with community leaders and neighbors.

We’re so excited to share more about Roundabout in the coming months, and we’re determined to grow and learn in public, along with you. Here’s a preview:

  • Roundabout is for building real relationships with people who actually live, work, and play in your community through trusted information sharing, genuine conversations, and mutual support, not posts competing for likes and virality.
  • Users can quickly find what they’re looking for via content that’s organized into topic-based channels, with a separate calendar to find events and a Guides section to find more evergreen resources.
  • Local stewards are supported and empowered. Each community will have its own vibes and culture.

The big platforms, some several decades old now, were built for profit — every decision, every design, optimized for uncontrolled growth and extraction. We’re doing something different.

As a project incubated within New_ Public, a nonprofit, Roundabout will grow incrementally, sustained by a diverse and balanced set of revenue sources. With business incentives aligned towards utility and everyday value, instead of engagement and relentless scale, we’re designing Roundabout to be shielded from the cycle of enshittification. The ultimate goal is to build for social trust — every decision, every design, optimized to build bonds and increase belonging…(More)”.

Introducing Roundabout: built for neighbors, with neighbors

Review by James Gleick: “It’s hard to remember—impossible, if you’re under thirty—but there was an Internet before there was a World Wide Web. Experts at their computer keyboards (phones were something else entirely) chatted and emailed and used Unix protocols called “finger” and “gopher” to probe the darkness for pearls of information. Around 1992 people started talking about an “Information Superhighway,” in part because of a national program championed by then senator Al Gore to link computer networks in universities, government, and industry. A highway was different from a web, though. It took time for everyone to catch on.

The Internet was a messy joint effort, but the web had a single inventor: Tim Berners-Lee, a computer programmer at the CERN particle physics laboratory in Geneva. His big idea boiled down to a single word: links. He thought he could organize a free-form world of information by prioritizing interconnections between documents—which could be text, pictures, sound, or anything at all. Suddenly it seemed everyone was talking about webpages and web browsers; people turned away from their television sets and discovered the thrills of surfing the web.

It’s also hard to remember the idealism and ebullience of those days. The world online promised to empower individuals and unleash a wave of creativity. Excitement came in two main varieties. One was a sense of new riches—an abundance, a cornucopia of information goodies. The Library of Congress was “going online” and so was the Louvre. “Click the mouse,” urged the New York Times technology reporter John Markoff:

There’s a NASA weather movie taken from a satellite high over the Pacific Ocean. A few more clicks, and one is reading a speech by President Clinton, as digitally stored at the University of Missouri. Click-click: a sampler of digital music recordings as compiled by MTV. Click again, et voilà: a small digital snapshot reveals whether a certain coffee pot in a computer science laboratory at Cambridge University in England is empty or full…(More)”

How the Web Was Lost

The Economist: “A good cover letter marries an applicant’s CV to the demands of the job. It helps employers identify promising candidates, particularly those with an employment history that is orthogonal to their career ambitions. And it serves as a form of signalling, demonstrating that the applicant cares enough about the position to go through a laborious process, rather than simply scrawling their desired salary at the top of a résumé and mass-mailing it to every business in the area.

Or, at least, it used to. The rise of large language models has changed the dynamic. Jobseekers can now produce a perfectly targeted cover letter, touching on all an advertisement’s stated requirements, at the touch of a button. Anyone and everyone can present themselves as a careful, diligent applicant, and do so hundreds of times a day. A new paper by Anaïs Galdin of Dartmouth College and Jesse Silbert of Princeton University uses data from Freelancer.com, a jobs-listing site, to work out what this means for the labour market.

Chart: The Economist

Comparing pre- and post-ChatGPT activity, two results stand out. The first is that cover letters have lengthened. In the pre-LLM era, the median one was 79 words long. (Since Freelancer.com attracts workers for one-off tasks, such letters are more to-the-point than those for full-time roles.) A few years later, post-ChatGPT, the median had risen to 104 words. In 2023 the site introduced its own AI tool, allowing users to craft a proposal without even having to leave the platform. The subset of applications written using the tool—the only ones that can be definitively labelled as AI-generated—are longer still, with a median length of 159 words, more than twice the human-written baseline…(More)”.

How AI is breaking cover letters

Article by Natasha Joshi: “…But what do we stand to lose when we privilege data science over human understanding?

C Thi Nguyen explains this through ‘value capture’. It is the process by which “our deepest values get captured by institutional metrics and then become diluted or twisted as a result. Academics aim at citation rates instead of real understanding; journalists aim for numbers of clicks instead of newsworthiness. In value capture, we outsource our values to large-scale institutions. Then all these impersonal, decontextualizing, de-expertizing filters get imported into our core values. And once we internalize those impersonalized values as our own, we won’t even notice what we’re overlooking.”

One such thing being overlooked is care.

Interpersonal caregiving makes no sense from a market lens. The person with power and resources voluntarily expends them to further another person’s well-being and goals. The whole idea of care is oceanic and hard to wrap one’s head around. ‘Head’ being the operative word, because we are trying to understand care with our brains, when it really exists in our bodies and is often performed by our bodies.

Data tools have only inferior ways of measuring care, and, by extension, of designing spaces and society for it.

Outside of specific, entangled relationships of care, humans also have an amorphous ability to feel that they are part of a larger whole. We are affiliated to humanity, the planet, and indeed the universe, and feel it in our bones rather than know it to be true in any objective way.

We see micro-entrepreneurs, inventors, climate stewards, and scores of people, both rich and poor, across circumstances who engage in collective care to make the world a better place. This kind of pro-sociality doesn’t always show in ways that are tangible, immediate, or measurable.

Datavism, which we seem to have learned from the bazaar, has convinced capital allocators that the impact of social programmes can and should be expressed arithmetically. And, based on those calculations, acts of care can be deemed successful or unsuccessful…(More)”.

Is data failing us?

Book by Ryan Calo: “Technology exerts a profound influence on contemporary society, shaping not just the tools we use but the environments in which we live. Law, uniquely among social forces, is positioned to guide and constrain the social fact of technology in the service of human flourishing. Yet, technology has proven disorienting to law: it presents itself as inevitable, makes a shell game of human responsibility, and daunts regulation. Drawing lessons from communities that critically assess emerging technologies, this book challenges the reflexive acceptance of innovation and critiques the widespread belief that technology is inevitable or ungovernable. It calls for a methodical, coherent approach to the legal analysis of technology—one capable of resisting technology’s disorienting qualities—thus equipping law to meet the demands of an increasingly technology-mediated world while helping to unify the field of law and technology itself…(More)”.

Law and Technology: A Methodical Approach

Article by Sarah Perez: “In a blog post, the Wikimedia Foundation, the organization that runs the popular online encyclopedia, called on AI developers to use its content “responsibly” by ensuring its contributions are properly attributed and that content is accessed through its paid product, the Wikimedia Enterprise platform.

The opt-in, paid product allows companies to use Wikipedia’s content at scale without “severely taxing Wikipedia’s servers,” the Wikimedia Foundation blog post explains. In addition, the product’s paid nature allows AI companies to support the organization’s nonprofit mission.

While the post doesn’t go so far as to threaten penalties or any sort of legal action for use of its material through scraping, Wikipedia recently noted that AI bots had been scraping its website while trying to appear human. After updating its bot-detection systems, the organization found that its unusually high traffic in May and June had come from AI bots that were trying to “evade detection.” Meanwhile, it said that “human page views” had declined 8% year-over-year.

Now Wikipedia is laying out its guidelines for AI developers and providers, saying that generative AI developers should provide attribution to give credit to the human contributors whose content it uses to create its outputs…(More)”.

Wikipedia urges AI companies to use its paid API, and stop scraping

OECD: “People in Latin America and the Caribbean are more optimistic than the OECD average about their governments’ ability to tackle complex global challenges, even as overall levels of trust in government remain lower, according to a new OECD report.

OECD Survey on Drivers of Trust in Public Institutions in Latin America and the Caribbean: 2025 Results is the first regional initiative conducted under the Global Trust Survey Project. Covering ten Latin American and Caribbean countries*, the survey explores participants’ experiences with and expectations of their governments across key areas such as reliability, responsiveness, ability to manage long-term and global challenges, integrity, fairness, and openness.

Across the ten countries surveyed: on average, 35% of people express high or moderately high trust in the national government, while around half (48%) report low or no trust. Public trust varies significantly across countries and institutions. The armed forces, police, and media are more trusted than the judiciary, the civil service, legislatures and political parties…

Trust also varies across population groups, with trust in public institutions lower among those with financial, security and discrimination concerns, and among women and younger people. Perceptions of political voice and partisanship are more strongly associated with trust gaps than socio-economic and demographic characteristics.

People who feel they have a say and the government listens to them are three times more likely to trust their government than those who do not. Currently, only 25% of respondents feel they have a say in government decisions, and just 36% believe national governments are held accountable by legislatures…(More)”.

Greater use of evidence and public input in policymaking could strengthen trust in Latin American and Caribbean public institutions

Book by Slavko Splichal: “…explores the evolving nature of publicness in the era of digital communication and social media saturation, arguing that the rise of the “gig public” represents a new paradigm that challenges the traditional conceptualization of the public in shaping social and political change. The gig public departs from traditional notions of publicness and the public, characterized by individuals’ spontaneous and less-structured engagement in public discourse. This engagement is often hampered by challenges in fostering sustained interaction and depth of discussion, due to the ephemeral nature of online interactions.

In particular, this monograph highlights the importance of customs, negotiations, and contracts that complement the normatively privileged public reasoning in public domains. It examines the transformations in the multifaceted nature of the public and its interrelationship with other social structures amid the shifting boundaries between public and private domains. In addition, it explores the evolution of conceptualizations of publicness and related concepts within critical theory, illustrating how contemporary shifts are redefining civic engagement and the essence of public life in a rapidly changing world.

From these perspectives, the study is structured around three primary focal points: First, it analyzes how new information technologies and AI have altered human interactions within the public sphere. Second, it examines the impact of capitalist economic dynamics and governmentality strategies on reshaping the public realm, fundamentally altering the essence of the public and its democratic potential. Third, it explores how habitual and routine practices traditionally associated with the private sphere are now influencing the ongoing evolution of publicness…(More)”.

The Gig Public

Paper by Yael Borofsky et al: “Infrastructure inequities define modern cities. This Perspective reflects the viewpoint of a transdisciplinary group of co-authors working to advance infrastructural equity in low-income urban contexts. We argue that methodological silos and data fragmentation undermine the creation of a knowledge base to support coordinated action across diverse actors. As technological advances make it possible to ‘see’ informal settlements without engaging residents, our agenda advocates for (1) the integration of diverse methodological and epistemological traditions; (2) a focus on research that informs context-specific action; and (3) a commitment to ethical standards that center affected communities in efforts to improve infrastructure access…(More)”.

An agenda for data-rich, action-oriented, ethical research on infrastructure in informal settlements

Get the latest news right in your inbox

Subscribe to curated findings and actionable knowledge from The Living Library, delivered to your inbox every Friday