
Stefaan Verhulst

Paper by Marcel Binz: “Establishing a unified theory of cognition has been an important goal in psychology. A first step towards such a theory is to create a computational model that can predict human behaviour in a wide range of settings. Here we introduce Centaur, a computational model that can predict and simulate human behaviour in any experiment expressible in natural language. We derived Centaur by fine-tuning a state-of-the-art language model on a large-scale dataset called Psych-101. Psych-101 has an unprecedented scale, covering trial-by-trial data from more than 60,000 participants performing in excess of 10,000,000 choices in 160 experiments. Centaur not only captures the behaviour of held-out participants better than existing cognitive models, but it also generalizes to previously unseen cover stories, structural task modifications and entirely new domains. Furthermore, the model’s internal representations become more aligned with human neural activity after fine-tuning. Taken together, our results demonstrate that it is possible to discover computational models that capture human behaviour across a wide range of domains. We believe that such models provide tremendous potential for guiding the development of cognitive theories, and we present a case study to demonstrate this…(More)”.
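The paper’s benchmark — how well a model predicts held-out participants’ trial-by-trial choices — can be made concrete with a toy example. The sketch below is purely illustrative and is not the paper’s method: it fits a simple softmax (logit) choice model to simulated two-option choice data and scores it by negative log-likelihood on held-out trials, the same kind of predictive metric used to compare Centaur against baseline cognitive models. All names and the grid-search procedure are assumptions for illustration.

```python
import math
import random

def softmax_choice_prob(values, beta):
    """Probability of choosing each option under a softmax rule
    with inverse temperature beta (higher beta = more deterministic)."""
    exps = [math.exp(beta * v) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

def negative_log_likelihood(beta, trials):
    """Sum of -log P(observed choice) over trial-by-trial data.
    Lower is better; this is the held-out predictive score."""
    nll = 0.0
    for values, choice in trials:
        probs = softmax_choice_prob(values, beta)
        nll -= math.log(probs[choice])
    return nll

# Simulate a participant who noisily prefers higher-valued options.
random.seed(0)
trials = []
for _ in range(200):
    values = [random.random(), random.random()]
    probs = softmax_choice_prob(values, 2.0)  # true beta = 2
    choice = 0 if random.random() < probs[0] else 1
    trials.append((values, choice))

# Fit beta by grid search on the first 150 trials, evaluate on the rest.
train, held_out = trials[:150], trials[150:]
best_beta = min((b / 10 for b in range(1, 51)),
                key=lambda b: negative_log_likelihood(b, train))
print(best_beta, round(negative_log_likelihood(best_beta, held_out), 2))
```

Centaur replaces the hand-specified choice rule with a fine-tuned language model, but the evaluation logic — score each candidate model by how well it predicts unseen participants’ choices — is the same.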

A foundation model to predict and capture human cognition

Book by Robert V. Moody, Ming-Dao Deng: “One of life’s most fundamental revelations is change. Presenting the fascinating view that pattern is the manifestation of change, this unique book explores the science, mathematics, and philosophy of change and the ways in which they have come to inform our understanding of the world. Through discussions on chance and determinism, symmetry and invariance, information and entropy, quantum theory and paradox, the authors trace the history of science and bridge the gaps between mathematical, physical, and philosophical perspectives. Change as a foundational concept is deeply rooted in ancient Chinese thought, and this perspective is integrated into the narrative throughout, providing philosophical counterpoints to customary Western thought. Ultimately, this is a book about ideas. Intended for a wide audience, it is not so much a book of answers as an introduction to new ways of viewing the world.

  • Combines mathematics and philosophy to explore the relationship between pattern and change
  • Uses examples from the world around us to illustrate how thinking has developed over time and in different parts of the world
  • Includes chapters on information, dynamics, symmetry, chance, order, the brain, and quantum mechanics, all introduced gently and building progressively toward deeper insights
  • Accompanied online by additional chapters and endnotes to explore topics of further interest…(More)”.
The Pattern of Change

Paper by Chiara Farronato, Andrey Fradkin & Tesary Lin: “We study the welfare consequences of choice architecture for online privacy using a field experiment that randomizes cookie consent banners. We study three ways in which firms or policymakers can influence choices: (1) nudging users through banner design to encourage acceptance of cookie tracking; (2) setting defaults when users dismiss banners; and (3) implementing consent decisions at the website versus browser level. Absent design manipulation, users accept all cookies more than half of the time. Placing cookie options behind extra clicks strongly influences choices, shifting users toward more easily accessible alternatives. Many users dismiss banners without making an explicit choice, underscoring the importance of default settings. Survey evidence further reveals substantial confusion about default settings. Using a structural model, we find that among consent policies requiring site-specific decisions, consumer surplus is maximized when consent interfaces clearly display all options and default to acceptance in the absence of an explicit choice. However, the welfare gains from optimizing banner design are much smaller than those from adopting browser-level consent, which eliminates the time costs of repeated decisions…(More)”.
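Two of the mechanisms the paper studies — how defaults resolve dismissed banners, and why browser-level consent reduces decision costs — can be sketched in a few lines. This is a minimal illustrative model, not the authors’ structural model; the function names and the per-banner time cost are assumptions for illustration.

```python
def effective_consent(explicit_choice, dismissed, default):
    """Resolve a user's consent for one site: an explicit choice wins;
    if the banner was dismissed without a choice, the default applies."""
    if explicit_choice is not None:
        return explicit_choice
    return default  # dismissed or not yet answered: default governs

def total_decision_cost(n_sites, seconds_per_banner, browser_level):
    """Time cost of consent decisions: one decision under browser-level
    consent, one per site under site-level consent."""
    decisions = 1 if browser_level else n_sites
    return decisions * seconds_per_banner

# A user who dismisses every banner inherits the default everywhere.
print(effective_consent(None, True, "accept"))   # default governs
print(effective_consent("reject", True, "accept"))  # explicit choice wins

# Site-level consent across 100 sites at 5 seconds per banner,
# versus a single browser-level decision.
print(total_decision_cost(100, 5, browser_level=False))
print(total_decision_cost(100, 5, browser_level=True))
```

Even this toy version shows the paper’s headline intuition: optimizing banner design changes outcomes at the margin, while browser-level consent eliminates the repeated time cost entirely.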

Designing Consent: Choice Architecture and Consumer Welfare in Data Sharing

A Report of the Center for Open Data Enterprise (CODE): “The U.S. has had a strong bipartisan consensus that open federal data is an essential public good. Since 2009, initiatives by Presidents Obama, Trump, and Biden and two acts of Congress have made federal data more accessible, transparent, and useful. The current presidential administration has not challenged these established principles. However, the administration has altered many government data programs on an individual basis, often with the rationale that they do not align with the President’s priorities.
Civil society has responded to these actions with a data rescue movement to archive critical datasets and keep them publicly available. There is a good chance that the movement will be able to save most of the federal data that was available in January 2025.
The greater risk, however, is to the future. The data we have today will not be very useful in a year or two, and future data collections are now under threat. Since the start of the Trump Administration, the federal government has:
● Dismantled and defunded agencies that collect data mandated by Congress
● Discontinued specific data programs
● Defunded research that can be a source of open scientific data
● Disbanded advisory committees for the U.S. Census Bureau and other data-collecting agencies and offices
● Removed data disaggregated by sexual orientation and gender identity
● Proposed changing established methods of data collection and publishing in some key areas
These changes can have a major impact on the many institutions – including state and local governments, businesses, civil society organizations, and more – that depend on federal data for policymaking, decision making, and growth…(More)”

America’s Data Future: Towards A Roadmap for Action

Report by ProPublica: “The Internal Revenue Service is building a computer program that would give deportation officers unprecedented access to confidential tax data.

ProPublica has obtained a blueprint of the system, which would create an “on demand” process allowing Immigration and Customs Enforcement to obtain the home addresses of people it’s seeking to deport.

Last month, in a previously undisclosed dispute, the acting general counsel at the IRS, Andrew De Mello, refused to turn over the addresses of 7.3 million taxpayers sought by ICE. In an email obtained by ProPublica, De Mello said he had identified multiple legal “deficiencies” in the agency’s request.

Two days later, on June 27, De Mello was forced out of his job, people familiar with the dispute said. The addresses have not yet been released to ICE. De Mello did not respond to requests for comment, and the administration did not address questions sent by ProPublica about his departure.

The Department of Government Efficiency began pushing the IRS to provide taxpayer data to immigration agents soon after President Donald Trump took office. The tax agency’s acting general counsel refused and was replaced by De Mello, who Trump administration officials viewed as more willing to carry out the president’s agenda. Soon after, the Department of Homeland Security, ICE’s parent agency, and the IRS negotiated a “memorandum of understanding” that included specific legal guardrails to safeguard taxpayers’ private information.

In his email, De Mello said ICE’s request for millions of records did not meet those requirements, which include having a written assurance that each taxpayer whose address is being sought was under active criminal investigation.

“There’s just no way ICE has 7 million real criminal investigations, that’s a fantasy,” said a former senior IRS official who had been advising the agency on this issue. The demands from the DHS were “unprecedented,” the official added, saying the agency was pressing the IRS to do what amounted to “a big data dump.”

In the past, when law enforcement sought IRS data to support its investigations, agencies would give the IRS the full legal name of the target, an address on file and an explanation of why the information was relevant to a criminal inquiry. Such requests rarely involved more than a dozen people at a time, former IRS officials said.

Danny Werfel, IRS commissioner during the Biden administration, said the privacy laws allowing federal investigators to obtain taxpayer data have never “been read to open the door to the sharing of thousands, tens of thousands, or hundreds of thousands of tax records for a broad-based enforcement initiative.”

A spokesperson for the White House said the planned use of IRS data was legal and a means of fulfilling Trump’s campaign pledge to carry out mass deportations of “illegal criminal aliens.”

Taxpayer data is among the most confidential in the federal government and is protected by strict privacy laws, which have historically limited its transfer to law enforcement and other government agencies. Unauthorized disclosure of taxpayer return information is a felony that can carry a penalty of up to five years in prison…(More)”.

The IRS Is Building a Vast System to Share Millions of Taxpayers’ Data With ICE

Paper by Christophe Gouache: “…we propose a new guiding approach to policymaking inspired by design. To do so, we build upon early critiques of the policy cycle model, such as Lasswell’s 1957 model (still used as a reference today), which, despite its clarity and simplicity, is purely theoretical (too linear, too static, too rational) and never plays out in the ‘real world’; we also question the actionability of the policy streams model proposed by Howlett in 2015. We then look at how designers have, in practice, ‘tinted’ policymaking with their own methods, and finally extract a new model for policymaking, one that builds on design thinking-and-doing methodology (questioning the UK Design Council’s double diamond) as well as the design ‘spirit’ (the capacity to improvise, to detour, to navigate uncertainty with ease)…(More)”.

What if design could transform the way we think and make public policies? Proposing a new model: the Policy Design Journey

Report by the National Academies of Sciences, Engineering, and Medicine: “Intergenerational mobility is an important measure of well-being that underlies a fundamental value: that anyone should be able to succeed economically based on their own merits, regardless of their circumstances. This has been a value held by many Americans throughout U.S. history, even as many observers may rightly argue that it has been, at times and for many groups, severely constrained. For all the emphasis placed on mobility in the United States, the chances Americans have of doing better than their parents and their chances of succeeding economically regardless of the advantages of birth are not higher than in other wealthy countries.

This report provides a forward-looking framework for data, research, and policy initiatives to boost upward mobility and better fulfill promises of opportunity and advancement for all members of U.S. society. The report focuses on key domains that shape mobility, including early life and family; the spaces and places where people live and work; postsecondary education; and credit, wealth, and debt. It also discusses the data infrastructure needed to support an extensive research agenda on economic and social mobility…(More)”.

Economic and Social Mobility: New Directions for Data, Research, and Policy

Paper by Wolfram Barfuss et al: “Cooperation at scale is critical for achieving a sustainable future for humanity. However, achieving collective, cooperative behavior—in which intelligent actors in complex environments jointly improve their well-being—remains poorly understood. Complex systems science (CSS) provides a rich understanding of collective phenomena, the evolution of cooperation, and the institutions that can sustain both. Yet, much of the theory in this area fails to fully consider individual-level complexity and environmental context—largely for the sake of tractability and because it has not been clear how to do so rigorously. These elements are well captured in multiagent reinforcement learning (MARL), which has recently put focus on cooperative (artificial) intelligence. However, typical MARL simulations can be computationally expensive and challenging to interpret. In this perspective, we propose that bridging CSS and MARL affords new directions forward. Both fields can complement each other in their goals, methods, and scope. MARL offers CSS concrete ways to formalize cognitive processes in dynamic environments. CSS offers MARL improved qualitative insight into emergent collective phenomena. We see this approach as providing the necessary foundations for a proper science of collective, cooperative intelligence. We highlight work that is already heading in this direction and discuss concrete steps for future research…(More)”.
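The bridge between the two fields can be made concrete with a minimal example. The sketch below is illustrative and not taken from the paper: two independent Q-learning agents (the standard MARL baseline) play an iterated prisoner’s dilemma, the canonical complex-systems model of cooperation. The payoff values and hyperparameters are assumptions for illustration.

```python
import random

# Prisoner's dilemma payoffs: (my action, other's action) -> my reward.
# Action 0 = cooperate, 1 = defect.
PAYOFF = {(0, 0): 3, (0, 1): 0, (1, 0): 5, (1, 1): 1}

class QLearner:
    """Independent, stateless Q-learner: one Q-value per action."""
    def __init__(self, alpha=0.1, epsilon=0.1):
        self.q = [0.0, 0.0]    # estimated value of each action
        self.alpha = alpha     # learning rate
        self.epsilon = epsilon # exploration rate

    def act(self):
        if random.random() < self.epsilon:
            return random.randrange(2)        # explore
        return 0 if self.q[0] >= self.q[1] else 1  # exploit

    def update(self, action, reward):
        self.q[action] += self.alpha * (reward - self.q[action])

random.seed(1)
a, b = QLearner(), QLearner()
for _ in range(5000):
    ai, bi = a.act(), b.act()
    a.update(ai, PAYOFF[(ai, bi)])
    b.update(bi, PAYOFF[(bi, ai)])

# With stateless independent learners, mutual defection (the one-shot
# Nash equilibrium) typically emerges; richer cognition, state, or
# institutions are needed to sustain cooperation -- exactly the gap
# between individual-level complexity and collective outcomes that
# the paper argues CSS and MARL should address together.
print(a.q, b.q)
```

CSS supplies the interpretive frame (why defection dominates, which mechanisms could rescue cooperation); MARL supplies the formalized learning dynamics that generate the behavior.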

Collective cooperative intelligence


The Economist: “… artificial intelligence is transforming the way that people navigate the web. As users pose their queries to chatbots rather than conventional search engines, they are given answers, rather than links to follow. The result is that “content” publishers, from news providers and online forums to reference sites such as Wikipedia, are seeing alarming drops in their traffic.

As AI changes how people browse, it is altering the economic bargain at the heart of the internet. Human traffic has long been monetised using online advertising; now that traffic is drying up. Content producers are urgently trying to find new ways to make AI companies pay them for information. If they cannot, the open web may evolve into something very different.

Since the launch of ChatGPT in late 2022, people have embraced a new way to seek information online. OpenAI, maker of ChatGPT, says that around 800m people use the chatbot. It is the most popular download on the iPhone app store. Apple said that conventional searches in its Safari web browser had fallen for the first time in April, as people posed their questions to AI instead. OpenAI is soon expected to launch a browser of its own. Its rise is so dramatic that a Hollywood adaptation is in the works.

As OpenAI and other upstarts have soared, Google, which has about 90% of the conventional search market in America, has added AI features to its own search engine in a bid to keep up. Last year it began preceding some search results with AI-generated “overviews”, which have since become ubiquitous. In May it launched “AI mode”, a chatbot-like version of its search engine. The company promises that, with AI, users can “let Google do the Googling for you”.

Chart: The Economist

Yet as Google does the Googling, humans no longer visit the websites from which the information is gleaned. Similarweb, which measures traffic to more than 100m web domains, estimates that worldwide search traffic (by humans) fell by about 15% in the year to June. Although some categories, such as hobbyists’ sites, are doing fine, others have been hit hard (see chart). Many of the most affected are just the kind that might have commonly answered search queries. Science and education sites have lost 10% of their visitors. Reference sites have lost 15%. Health sites have lost 31%…(More)”.

AI is killing the web. Can anything save it?

Paper by Liz Richardson, Catherine Durose, Lucy Kimbell and Ramia Mazé: “‘Design for policy’ is a prominent framing of the intersection between policy and design. Here, we ask, if design is ‘for’ policy, then what exactly is it doing? We make a critique of literature that explains the interaction of design and policy by listing practices (prototyping or visualisation, for example) but that misses the reasons why those practices are being used. We build on and advance scholarship that anchors design in relation to the demands, constraints and politics of policy making, taking account of the quite different forms a relationship between design (as a thing) and policy design (as a process) can have. Within this debate we propose that design’s relationship to policy is not always in service to (‘for’), but also sometimes ‘with’, and even sometimes ‘against’. We set out an original typology which differentiates roles of design in policy along the lines of their ultimate purpose, scope and terms on which design and policy interact. We identify an instrumental relationship, in which design is a tool to support achieving specified goals of policy making; an improvisational relationship, seeing design as a practice enabling policy making to be more open in the face of unfolding events and experiences; and a generative relationship where design facilitates the re-envisioning of policy making. Through our analysis and proposed typology, we aim to address overly specific and overly homogenising understandings of design in the policy space, enabling a more critical understanding of the different intents and implications at play within the ‘design turn’ in policy…(More)”.

How do policy and design intersect? Three relationships
