

Julian Baggini at the Financial Times: “A decade ago, Cass Sunstein and Richard Thaler’s book Nudge was on the desk of every serious politician and policy wonk. Its central thesis was alluringly simple: by changing the environment in which we make decisions — the “choice architecture” — people could be encouraged to do things that were good for them and for society without governments compelling them to do anything.

The idea hit the liberal sweet-spot, promising maximum social impact for minimal interference with personal freedom. In 2010, Britain’s government set up its Behavioural Insights Team — popularly known as the “nudge unit” — to put these ideas into practice.

Around the world, others followed. Sunstein is justly proud that 10m poor American children now get free breakfast and lunch during the academic year as a result of just one such intervention making enrolment for free school meals automatic.

Ten years on, Sunstein has produced two new books to win over the unconverted and boost the faith of true believers. One, On Freedom, is a tiny, commuter-friendly pamphlet between hard covers. The other, Trusting Nudges, co-authored with the behavioural economist Lucia A Reisch, is a short, thoughtful, measured and important analysis of what citizens actually think about nudging and why that matters — albeit with the dry, academic furniture of endless tables, footnotes and technical appendices.

Despite the stylistic gulf between them, the two books are best read together as a response to those who would like to give nudges the nudge, claiming that they are covert, manipulative, an insult to human agency and place too much trust in governments and too little in human reason. Not only that, but for all the hype, nudges only work at the margins, delivering relatively minor results without having any major impact on poverty, inequity or inequality.

On Freedom economically and elegantly takes apart the accusation that nudges undermine liberty. Sunstein rightly points out that a nudge is only a nudge by definition if it leaves the nudged able to choose otherwise. For example, the system adopted by several jurisdictions to put people on organ donation registers by default carries with it the right to opt out. Nor are the best nudges covert.

There may not be a sign at the canteen telling you that healthy foods have been put at the front because that’s where you’re more likely to choose them, but organisations that adopt this as a policy can and should do so openly. Sunstein’s most important argument is that “we cannot wish choice architecture away”: something has to be on the supermarket shelves that people tend to take more from, something has to be the default for benefit claims. The question is not whether we nudge but how we do so: with forethought or without….(More)”

How nudge theory is ageing well

Gwendolyn Wu at San Francisco Chronicle: “In an effort to shorten emergency response times in San Francisco, the city announced on Monday that it is now using location data from RapidSOS, a New York-based public safety tech company, and ride-hailing company Uber to improve location coordinates generated from 911 calls.

An increasing number of emergency calls are made from cell phones, said Michelle Cahn, RapidSOS’s director of community engagement. The new technology should allow emergency responders to narrow down the location of such callers and replace existing 911 technology that was built for landlines and tied to home addresses.

Cell phone location data currently given to dispatchers when they receive a 911 call can be vague, especially if the person can’t articulate their exact location, according to the Department of Emergency Management.

But if a dispatcher can narrow down where the emergency is happening, that increases the chance of a timely response and better result, Cahn said.

“It doesn’t matter what’s going on with the emergency if we don’t know where it is,” she said.

RapidSOS shares its location data — collected by Apple and Google for their in-house map apps — free of charge with public safety agencies. San Francisco’s 911 call center adopted the data service in September 2018.

The Federal Communications Commission estimates that agencies could save as many as 10,000 lives a year if they shave a minute off response times. Federal officials issued new rules to improve wireless 911 calls in 2015, asking mobile carriers to provide more accurate locations to call centers. Carriers are required to find a way to triangulate the caller’s location within 50 meters — a much smaller radius than the eight blocks city officials were initially presented with in October when the caller dialed 911…(More)”.

San Francisco teams up with Uber, location tracker on 911 call responses

Book by Oldrich Bubak and Henry Jacek: “Centering on public discourse and its fundamental lapses, this book takes a unique look at key barriers to social and political advancement in the information age. Public discourse is replete with confident, easy-to-manage claims, intuitions, and other shortcuts; foremost among these is trivialization, the tendency to distill multifaceted dilemmas into binary choices, neglect the big picture, gloss over alternatives, or filter reality through a lens of convenience—leaving little room for nuance and hence debate.

Far from superficial, such lapses are symptoms of deeper, intrinsically connected shortcomings inviting further attention. Focusing primarily on industrialized democracies, the authors take their readers on a transdisciplinary journey into the world of trivialization, engaging as they do so the intricate issues borne of a modern environment both enabled and constrained by technology. Ultimately, the authors elaborate upon the emerging counterweights to conventional worldviews and the paradigmatic alternatives that promise to help open new avenues for progress….(More)”.

Trivialization and Public Opinion: Slogans, Substance, and Styles of Thought in the Age of Complexity

Neil Munshi in the Financial Times: “The world’s largest drone delivery network, ferrying 150 different medicines and vaccines, as well as blood, to 2,000 clinics in remote parts of Ghana, is set to be announced on Wednesday.

The network represents a big expansion for the Silicon Valley start-up Zipline, which began delivering blood in Rwanda in 2016 using pilotless, preprogrammed aircraft. The move, along with a new agreement in Rwanda signed in December, takes the company beyond simple blood distribution to more complicated vaccine and plasma deliveries.

“What this is going to show is that you can reach every GPS co-ordinate, you can serve everybody,” said Keller Rinaudo, Zipline chief executive. “Every human in that region or country [can be] within a 15-25 minute delivery of any essential medical product — it’s a different way of thinking about universal coverage.”

Zipline will deliver vaccines for yellow fever, polio, diphtheria and tetanus, which are provided by the World Health Organisation’s Expanded Programme on Immunisation. The WHO will also use the company’s system for future mass immunisation programmes in Ghana.

Later this year, Zipline has plans to start operations in the US, in North Carolina, and in south-east Asia. The company said it will be able to serve 100m people within a year, up from the 22m that its projects in Ghana and Rwanda will cover.

In Ghana, Zipline said health workers will receive deliveries via a parachute drop within about 30 minutes of placing their orders by text message….(More)”.

Drones to deliver medicines to 12m people in Ghana

Afua Bruce at the Hill: “The city of Los Angeles recently released three free apps for its citizens: one to report broken street lighting, one to make 311 requests and one to get early alerts about earthquakes. Though it may seem like the city is just following a trend to modernize, the apps are part of a much larger effort to spread awareness of the more than 1,100 datasets that the city has made public for citizens to view, analyze and share. In other words, the city has officially embraced the open data movement.

In the past few years, communities across the country have realized the power of data once only available to government. Often, the conversation about data focuses on criminal justice, because the demand for this data is being met by high-profile projects like Kamala Harris’ Open Justice Initiative, which makes California criminal justice data available to the citizenry, and the Open Data Policing Project, which provides a publicly searchable database of stop, search and use-of-force data. But the possibilities for data go far beyond justice and show its potential for use in a variety of spaces, such as efforts to preserve local wildlife, track potholes and understand community health trends….(More)”.

Open data promotes citizen engagement at the local level

Rebecca Winthrop at Brookings: “Much has been written on the worrisome trends in Americans’ faith and participation in our nation’s democracy. According to the World Values Survey, almost 20 percent of millennials in the U.S. think that military rule or an authoritarian dictator is a “fairly good” form of government, and only 29 percent believe that living in a country that is governed democratically is “absolutely important.” In the last year, trust in American democratic institutions has dropped—only 53 percent of Americans view American democracy positively. This decline in faith and participation in our democracy has been ongoing for some time, as noted in the 2005 collection of essays, “Democracy At Risk: How Political Choices Undermine Citizen Participation, and What We Can Do About It.” The essays chart the “erosion of the activities and capacities of citizenship” from voting to broad civic engagement over the past several decades.

While civil society and government have been the actors most commonly addressing this worrisome trend, is there also a constructive role for the private sector to play? After all, compared to other options like military or authoritarian rule, a functioning democracy is much more likely to provide the conditions for free enterprise that business desires. One only has to look to the current events in Venezuela for a quick reminder of this.

Many companies do engage in a range of activities that broadly support civic engagement, from dedicating corporate social responsibility (CSR) dollars to civically-minded community activities to supporting employee volunteerism. These are worthy activities and should certainly continue, but given the crisis of faith in the foundations of our democratic process, the private sector could play a much bigger role in helping support a movement for renewed understanding of and participation in our political process. Many of the private sector’s most powerful tools for doing this lie not inside companies’ CSR portfolios but in their unique expertise in selling things. Every day companies leverage their expertise in influence—from branding to market-segmentation—to get Americans to use their products and services. What if this expertise were harnessed toward promoting civic understanding and engagement?

Companies could play a particularly useful role by tapping new resources to amplify existing good work and build increasing interest in civic engagement. Two ways of doing this could include the below….(More)”

Selling civic engagement: A unique role for the private sector?

News Release from the Santa Fe Institute: “In nature, group decisions are often a matter of life or death. At first glance, the way certain groups of animals like minnows branch off into smaller sub-groups might seem counterproductive to their survival. After all, information about, say, where to find some tasty fish roe or which waters harbor more of their predators, would flow more freely and seem to benefit more minnows if the school of fish behaved as a whole. However, new research published in Philosophical Transactions of the Royal Society B sheds light on the complexity of collective decision-making and uncovers new insights into the benefits of the internal structure of animal groups.

In their paper, Albert Kao, a Baird Scholar and Omidyar Fellow at the Santa Fe Institute, and Iain Couzin, Director of the Max Planck Institute for Ornithology and Chair of Biodiversity and Collective Behavior at the University of Konstanz, simulate the information-sharing patterns of animals that prefer to interact with certain individuals over others. The authors’ modeling of such animal groups upends previously held assumptions about internal group structure and improves upon our understanding of the influence of group organization and environment on both the collective decision-making process and its accuracy.

Modular — or cliquey — group structure isolates the flow of communication between individuals, so that only certain animals are privy to certain pieces of information. “A feature of modular structure is that there’s always information loss,” says Kao, “but the effect of that information loss on accuracy depends on the environment.”

In simple environments, the impact of these modular groups is detrimental to accuracy, but when animals face many different sources of information, the effect is actually the opposite. “Surprisingly,” says Kao, “in complex environments, the information loss even helps accuracy in a lot of situations.” More information, in this case, is not necessarily better.

“Modular structure can have a profound — and unexpected — impact on the collective intelligence of groups,” says Couzin. “This may indeed be one of the reasons that we see internal structure in so many group-living species, from schooling fish and flocking birds to wild primate groups.”

Potentially, these new observations could be applied to many different kinds of social networks, from the migration patterns of birds to the navigation of social media landscapes to the organization of new companies, deepening our grasp of complex organization and collective behavior….(More)”.

(The paper, “Modular structure within groups causes information loss but can improve decision accuracy,” is part of a theme issue in the Philosophical Transactions of the Royal Society B entitled “Liquid Brains, Solid Brains: How distributed cognitive architectures process information.” The issue was inspired by a Santa Fe Institute working group and edited by Ricard Solé (Universitat Pompeu Fabra), Melanie Moses (University of New Mexico), and Stephanie Forrest (Arizona State University).)
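
For intuition about what module-level pooling of information can do to a group decision, here is a toy Monte Carlo sketch in Python. It is an editorial illustration under stated assumptions, not the authors' model: it simply compares one majority vote over the whole group with a two-stage vote in which cues are first pooled within modules; the group size, cue reliability, and the single "shared cue" used as a crude stand-in for correlated information are all invented for the example.

```python
# Toy Monte Carlo sketch (editorial illustration, not the authors' model).
# It compares two ways of turning noisy individual cues into a group decision:
#   1) one majority vote over every individual in the group, and
#   2) a majority vote within each module, followed by a vote across modules.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def run_trial(n_agents=60, n_modules=6, p_correct=0.6, rho_shared=0.5):
    """One binary decision; returns (fully_pooled_correct, modular_correct)."""
    # A single 'shared' cue that a fraction rho_shared of agents copy --
    # a crude stand-in for correlated information in a complex environment.
    shared_cue = rng.random() < p_correct
    uses_shared = rng.random(n_agents) < rho_shared
    private_cues = rng.random(n_agents) < p_correct
    votes = np.where(uses_shared, shared_cue, private_cues).astype(int)

    # Fully connected group: simple majority over all individual votes.
    fully_pooled = votes.sum() > n_agents / 2

    # Modular ("cliquey") group: majority inside each module, then across modules.
    modules = np.array_split(votes, n_modules)
    module_votes = [m.sum() > len(m) / 2 for m in modules]
    modular = sum(module_votes) > n_modules / 2

    return fully_pooled, modular

results = np.array([run_trial() for _ in range(5000)])
print("fully connected accuracy:", results[:, 0].mean())
print("modular accuracy:        ", results[:, 1].mean())
```

Varying the share of agents copying the shared cue, or the number of modules, is one way to explore how the ranking between the two pooling schemes can shift with the information environment.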

Group decisions: When more information isn’t necessarily better

Paper by Timotheus Kampik and Amro Najjar: “The spread of radical opinions, facilitated by homophilic Internet communities (echo chambers), has become a threat to the stability of societies around the globe. The concept of choice architecture–the design of choice information for consumers with the goal of facilitating societally beneficial decisions–provides a promising (although not uncontroversial) general concept to address this problem.

The choice architecture approach is reflected in recent proposals advocating for recommender systems that consider the societal impact of their recommendations and not only strive to optimize revenue streams.

However, the precise nature of the goal state such systems should work towards remains an open question. In this paper, we suggest that this goal state can be defined by considering target opinion spread in a society on different topics of interest as a multivariate normal distribution; i.e., while there is a diversity of opinions, most people have similar opinions on most topics. We explain why this approach is promising, and list a set of cross-disciplinary research challenges that need to be solved to advance the idea….(More)”.
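
As a rough illustration of how such a goal state might be operationalized, the sketch below treats the target opinion spread over a handful of topics as a multivariate normal and scores an observed opinion matrix by the KL divergence of a Gaussian fitted to it from that target. The topic count, target parameters, simulated opinions, and the divergence-based score are editorial assumptions, not the paper's formalism.

```python
# Minimal sketch (editorial assumptions, not the paper's formalism): represent
# the desired spread of opinions across K topics as a multivariate normal target
# and score an observed opinion matrix by the KL divergence of a Gaussian fit
# to it from that target. All numbers here are illustrative.
import numpy as np

K = 3                                    # topics of interest
target_mean = np.zeros(K)                # a moderate position on each topic
target_cov = 0.5 * np.eye(K) + 0.1       # some diversity, mild topic correlation

def divergence_from_target(opinions, mean, cov):
    """KL( N(fit) || N(target) ) for a Gaussian fitted to the observed opinions."""
    fit_mean = opinions.mean(axis=0)
    fit_cov = np.cov(opinions, rowvar=False)
    k = mean.shape[0]
    inv_target = np.linalg.inv(cov)
    diff = mean - fit_mean
    return 0.5 * (np.trace(inv_target @ fit_cov)
                  + diff @ inv_target @ diff
                  - k
                  + np.log(np.linalg.det(cov) / np.linalg.det(fit_cov)))

# Simulated society: 1,000 people, polarized ("echo-chambered") on topic 0.
rng = np.random.default_rng(1)
opinions = rng.normal(size=(1000, K))
opinions[:, 0] += np.where(rng.random(1000) < 0.5, -2.0, 2.0)

print("divergence from target spread:",
      divergence_from_target(opinions, target_mean, target_cov))
```

Under these assumptions, a recommender that weighs societal impact could compare candidate recommendation policies by the divergence their predicted opinion shifts would leave relative to the target.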

Technology-facilitated Societal Consensus

Mark Phillips and Bartha M. Knoppers in the Journal of Law, Medicine and Ethics: “Open science has recently gained traction as establishment institutions have come on-side and thrown their weight behind the movement and initiatives aimed at creation of information commons. At the same time, the movement’s traditional insistence on unrestricted dissemination and reuse of all information of scientific value has been challenged by the movement to strengthen protection of personal data. This article assesses tensions between open science and data protection, with a focus on the GDPR.

Powerful institutions across the globe have recently joined the ranks of those making substantive commitments to “open science.” For example, the European Commission and the NIH National Cancer Institute are supporting large-scale collaborations, such as the Cancer Genome Collaboratory, the European Open Science Cloud, and the Genomic Data Commons, with the aim of making giant stores of genomic and other data readily available for analysis by researchers. In the field of neuroscience, the Montreal Neurological Institute is midway through a novel five-year project through which it plans to adopt open science across the full spectrum of its research. The commitment is “to make publicly available all positive and negative data by the date of first publication, to open its biobank to registered researchers and, perhaps most significantly, to withdraw its support of patenting on any direct research outputs.” The resources and influence of these institutions seem to be tipping the scales, transforming open science from a longstanding aspirational ideal into an existing reality.

Although open science lacks any standard, accepted definition, one widely-cited model proposed by the Austria-based advocacy effort openscienceASAP describes it by reference to six principles: open methodology, open source, open data, open access, open peer review, and open educational resources. The overarching principle is “the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process.” This article adopts this principle as a working definition of open science, with a particular emphasis on open sharing of human data.

As noted above, many of the institutions committed to open science use the word “commons” to describe their initiatives, and the two concepts are closely related. “Medical information commons” refers to “a networked environment in which diverse sources of health, medical, and genomic information on large populations become widely shared resources.” Commentators explicitly link the success of information commons and progress in the research and clinical realms to open science-based design principles such as data access and transparent analysis (i.e., sharing of information about methods and other metadata together with medical or health data).

But what legal, as well as ethical and social, factors will ultimately shape the contours of open science? Should all restrictions be fought, or should some be allowed to persist, and if so, in what form? Given that a commons is not a free-for-all, in that its governing rules shape its outcomes, how might we tailor law and policy to channel open science to fulfill its highest aspirations, such as universalizing practical access to scientific knowledge and its benefits, and avoid potential pitfalls? This article primarily concerns research data, although passing reference is also made to the approach to the terms under which academic publications are available, which are subject to similar debates….(More)”.

Whose Commons? Data Protection as a Legal Limit of Open Science

Paper by Angela G. Villanueva et al: “Advances in technologies and biomedical informatics have expanded capacity to generate and share biomedical data. With a lens on genomic data, we present a typology characterizing the data-sharing landscape in biomedical research to advance understanding of the key stakeholders and existing data-sharing practices. The typology highlights the diversity of data-sharing efforts and facilitators and reveals how novel data-sharing efforts are challenging existing norms regarding the role of individuals whom the data describe.

Technologies such as next-generation sequencing have dramatically expanded capacity to generate genomic data at a reasonable cost, while advances in biomedical informatics have created new tools for linking and analyzing diverse data types from multiple sources. Further, many research-funding agencies now mandate that grantees share data. The National Institutes of Health’s (NIH) Genomic Data Sharing (GDS) Policy, for example, requires NIH-funded research projects generating large-scale human genomic data to share those data via an NIH-designated data repository such as the Database of Genotypes and Phenotypes (dbGaP). Another example is the Parent Project Muscular Dystrophy, a non-profit organization that requires applicants to propose a data-sharing plan and takes into account an applicant’s history of data sharing.

The flow of data to and from different projects, institutions, and sectors is creating a medical information commons (MIC), a data-sharing ecosystem consisting of networked resources sharing diverse health-related data from multiple sources for research and clinical uses. This concept aligns with the 2018 NIH Strategic Plan for Data Science, which uses the term “data ecosystem” to describe “a distributed, adaptive, open system with properties of self-organization, scalability and sustainability” and proposes to “modernize the biomedical research data ecosystem” by funding projects such as the NIH Data Commons. Consistent with Elinor Ostrom’s discussion of nested institutional arrangements, an MIC is both singular and plural and may describe the ecosystem as a whole or individual components contributing to the ecosystem. Thus, resources like the NIH Data Commons with its associated institutional arrangements are MICs, and also form part of the larger MIC that encompasses all such resources and arrangements.

Although many research funders incentivize data sharing, in practice, progress in making biomedical data broadly available to maximize its utility is often hampered by a broad range of technical, legal, cultural, normative, and policy challenges that include achieving interoperability, changing the standards for academic promotion, and addressing data privacy and security concerns. Addressing these challenges requires multi-stakeholder involvement. To identify relevant stakeholders and advance understanding of the contributors to an MIC, we conducted a landscape analysis of existing data-sharing efforts and facilitators. Our work builds on typologies describing various aspects of data sharing that focused on biobanks, research consortia, or where data reside (e.g., degree of data centralization).7 While these works are informative, we aimed to capture the biomedical data-sharing ecosystem with a wider scope. Understanding the components of an MIC ecosystem and how they interact, and identifying emerging trends that test existing norms (such as norms respecting the role of individuals whom the data describe), is essential to fostering effective practices, policies and governance structures, guiding resource allocation, and promoting the overall sustainability of the MIC….(More)”

Characterizing the Biomedical Data-Sharing Landscape
