Nudging Consumers to Purchase More Sustainably


Article by Erez Yoeli: “Most consumers still don’t choose sustainable products when the option is available. Americans may claim to be willing to pay more for green energy, but while green energy is available in the majority of states — 35 out of 50 states or roughly 80% of American households as of 2018, at least — only 14% of households were even aware of the green option, and less than half of these households purchased it. Hybrids and electric vehicles are available nationwide, but still amount to just 10% of sales — 6.6% and 3.4%, respectively, according to S&P Global’s subscription services.

Now it may be that this virtue thinking-doing gap will eventually close. I hope so. But it will certainly need help, because in these situations there’s often an insidious behavioral dynamic at work that stops stated good intentions from turning into actual good deeds…

Allow me to illustrate what I mean by “the plausible deniability effect” with an example from a now-classic behavioral economics study. Every year, around the holidays, Salvation Army volunteers collect donations for the needy outside supermarkets and other retail outlets. Researchers Justin Rao, Jim Andreoni, and Hannah Trachtman teamed up with a Boston chapter of the Salvation Army to test ways of increasing donations.

Taking a supermarket that had two exit/entry points, the team randomly divided the volunteers into two groups. In one group, just one volunteer was assigned to stand in front of one door. For the other group, volunteers were stationed at both doors…(More)”.

A User’s Guide to the Periodic Table of Open Data


Guide by Stefaan Verhulst and Andrew Zahuranec: “Leveraging our research on the variables that determine Open Data’s Impact, the Open Data Policy Lab is pleased to announce the publication of a new report designed to assist organizations in implementing the elements of a successful data collaborative: A User’s Guide to The Periodic Table of Open Data.

The User’s Guide is a fillable document designed to empower data stewards and others seeking to improve data access. It can be used as a checklist and tool to weigh different elements based on their context and priorities. By completing the forms (offline/online), you will be able to take a more comprehensive and strategic view of what resources and interventions may be required.

Download and fill out the User’s Guide to operationalize the elements in your data initiative

In conjunction with the release of our User’s Guide, the Open Data Policy Lab is pleased to present a completely reworked version of our Periodic Table of Open Data Elements, first launched in 2016. We sought to categorize the elements that matter in open data initiatives into five categories: problem and demand definition, capacity and culture, governance and standards, personnel and partnerships, and risk mitigation. More information on each can be found in the attached report or in the interactive table below.

Read more about the Periodic Table of Open Data Elements and how you can use it to support your work…(More)”.

One Data Point Can Beat Big Data


Essay by Gerd Gigerenzer: “…In my research group at the Max Planck Institute for Human Development, we’ve studied simple algorithms (heuristics) that perform well under volatile conditions. One way to derive these rules is to rely on psychological AI: to investigate how the human brain deals with situations of disruption and change. Back in 1838, for instance, Thomas Brown formulated the Law of Recency, which states that recent experiences come to mind faster than those in the distant past and are often the sole information that guides human decisions. Contemporary research indicates that people do not automatically rely on what they recently experienced, but only do so in unstable situations where the distant past is not a reliable guide for the future. In this spirit, my colleagues and I developed and tested the following “brain algorithm”:

Recency heuristic for predicting the flu: Predict that this week’s proportion of flu-related doctor visits will equal that of the most recent data, from one week ago.

Unlike Google’s secret Flu Trends algorithm, this rule is transparent and can be easily applied by everyone. Its logic can be understood. It relies on a single data point only, which can be looked up on the website of the Centers for Disease Control and Prevention. And it dispenses with combing through 50 million search terms and trial-and-error testing of millions of algorithms. But how well does it actually predict the flu?

Three fellow researchers and I tested the recency rule using the same eight years of data on which the Google Flu Trends algorithm was tested, that is, weekly observations between March 2007 and August 2015. During that time, the proportion of flu-related visits among all doctor visits ranged between one percent and eight percent, with an average of 1.8 percent of visits per week (Figure 1). This means that if every week you were to make the simple but false prediction that there are zero flu-related doctor visits, you would have a mean absolute error of 1.8 percentage points over those eight years. Google Flu Trends predicted much better than that, with a mean error of 0.38 percentage points (Figure 2). The recency heuristic had a mean error of only 0.20 percentage points, which is even better. If we exclude the period when the swine flu happened, that is, before the first update of Google Flu Trends, the result remains essentially the same (0.38 and 0.19, respectively)….(More)”.
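The recency heuristic is simple enough to state in a few lines of code. The sketch below uses made-up weekly percentages rather than the CDC data Gigerenzer's team analyzed, purely to illustrate how the forecast and the mean-absolute-error comparison work:

```python
# Recency heuristic: predict that this week's proportion of flu-related
# doctor visits equals last week's observed proportion.
def recency_forecast(series):
    """For each week t >= 1, the prediction is the value observed at t - 1."""
    return series[:-1]

def mean_absolute_error(predictions, actuals):
    return sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(predictions)

# Illustrative, invented weekly percentages of flu-related doctor visits.
observed = [1.2, 1.5, 2.1, 2.0, 1.7, 1.4, 1.3]

preds = recency_forecast(observed)   # forecasts for weeks 2..7
actual = observed[1:]                # the weeks being predicted
mae = mean_absolute_error(preds, actual)
```

On real data, this single-data-point rule is what the essay reports beating Google Flu Trends' mean error of 0.38 percentage points.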

Nowcasting daily population displacement in Ukraine through social media advertising data


Pre-Publication Paper by Douglas R. Leasure et al: “In times of crisis, real-time data mapping population displacements are invaluable for targeted humanitarian response. The Russian invasion of Ukraine on February 24, 2022, forcibly displaced millions of people from their homes, including nearly 6m refugees flowing across the border in just a few weeks, but information was scarce regarding displaced and vulnerable populations who remained inside Ukraine. We leveraged near real-time social media marketing data to estimate sub-national population sizes every day disaggregated by age and sex. Our metric of internal displacement estimated that 5.3m people had been internally displaced away from their baseline administrative region by March 14. Results revealed four distinct displacement patterns: large scale evacuations, refugee staging areas, internal areas of refuge, and irregular dynamics. While this innovative approach provided one of the only quantitative estimates of internal displacement in virtual real-time, we conclude by acknowledging risks and challenges for the future…(More)”.
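The excerpt does not spell out how the displacement metric is computed, but the basic idea of "people displaced away from their baseline region" can be sketched as a comparison between a pre-invasion baseline and the current social-media-derived population estimate. The region names and figures below are invented for illustration, not taken from the paper:

```python
# Hypothetical sketch: net internal displacement as the total population
# lost by regions relative to their pre-invasion baseline.
# Values are in millions and entirely made up.
baseline = {"Kyiv": 3.0, "Kharkiv": 1.4, "Lviv": 0.8}
current  = {"Kyiv": 2.2, "Kharkiv": 0.9, "Lviv": 1.6}

# People displaced *away from* a region = the population that region has
# lost; regions that gained people (here, Lviv) contribute zero.
displaced_away = sum(max(baseline[r] - current[r], 0.0) for r in baseline)
```

In the paper's actual pipeline, the current estimates would come from daily social media advertising audience counts, disaggregated by age and sex and calibrated against census baselines; this sketch only captures the differencing step.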

Voices in the Code: A Story about People, Their Values, and the Algorithm They Made


Book by David G. Robinson: “Algorithms—rules written into software—shape key moments in our lives: from who gets hired or admitted to a top public school, to who should go to jail or receive scarce public benefits. Today, high-stakes software is rarely open to scrutiny, but its code navigates moral questions: Which of a person’s traits are fair to consider as part of a job application? Who deserves priority in accessing scarce public resources, whether those are school seats, housing, or medicine? When someone first appears in a courtroom, how should their freedom be weighed against the risks they might pose to others?

Policymakers and the public often find algorithms to be complex, opaque and intimidating—and it can be tempting to pretend that hard moral questions have simple technological answers. But that approach leaves technical experts holding the moral microphone, and it stops people who lack technical expertise from making their voices heard. Today, policymakers and scholars are seeking better ways to share the moral decision-making within high-stakes software — exploring ideas like public participation, transparency, forecasting, and algorithmic audits. But there are few real examples of those techniques in use.

In Voices in the Code, scholar David G. Robinson tells the story of how one community built a life-and-death algorithm in a relatively inclusive, accountable way. Between 2004 and 2014, a diverse group of patients, surgeons, clinicians, data scientists, public officials and advocates collaborated and compromised to build a new transplant matching algorithm – a system to offer donated kidneys to particular patients from the U.S. national waiting list…(More)”.

China May Be Chasing Impossible Dream by Trying to Harness Internet Algorithms


Article by Karen Hao: “China’s powerful cyberspace regulator has taken the first step in a pioneering—and uncertain—government effort to rein in the automated systems that shape the internet.

Earlier this month, the Cyberspace Administration of China published summaries of 30 core algorithms belonging to two dozen of the country’s most influential internet companies, including TikTok owner ByteDance Ltd., e-commerce behemoth Alibaba Group Holding Ltd. and Tencent Holdings Ltd., owner of China’s ubiquitous WeChat super app.

The milestone marks the first systematic effort by a regulator to compel internet companies to reveal information about the technologies powering their platforms, which have shown the capacity to radically alter everything from pop culture to politics. It also puts Beijing on a path that some technology experts say few governments, if any, are equipped to handle….

One important question the effort raises, algorithm experts say, is whether direct government regulation of algorithms is practically possible.

The majority of today’s internet platform algorithms are based on a technology called machine learning, which automates decisions such as ad-targeting by learning to predict user behaviors from vast repositories of data. Unlike traditional algorithms that contain explicit rules coded by engineers, most machine-learning systems are black boxes, making it hard to decipher their logic or anticipate the consequences of their use.

Beijing’s interest in regulating algorithms started in 2020, after TikTok sought an American buyer to avoid being banned in the U.S., according to people familiar with the government’s thinking. When several bidders for the short-video platform lost interest after Chinese regulators announced new export controls on information-recommendation technology, it tipped off Beijing to the importance of algorithms, the people said…(More)”.

State of Gender Data


Report by Data2X: “Gender data is fundamental to achieving gender equality and the Sustainable Development Goals. It helps identify inequalities, illuminate a path forward, and monitor global progress. As recognition of its importance has grown over the last decade, the availability of gender data—and its use in decision-making—has improved.

Yet overlapping crises, from the COVID-19 pandemic to climate change and conflict, have imperiled progress on gender equality and the Sustainable Development Goals. In 2022, UN Secretary-General António Guterres declared that the Sustainable Development Goals are in need of rescue. The 2022 SDG Gender Index by EM2030 found little progress on global gender equality between 2015 and 2020, and a recent assessment by UN Women demonstrates that more than one quarter of the indicators needed to measure progress on gender equality are “far or very far” from 2030 targets…The State of Gender Data is an evolving Data2X publication and digital experience designed to highlight global progress and spur action on gender data. Data2X will update the initiative annually, providing insight into a new dimension of gender data. For our initial launch, we focus on examining funding trends and highlighting promising solutions and key commitments….(More)”.

New Theory for Increasingly Tangled Banks


Essay by Saran Twombly: “Decades before the COVID-19 pandemic demonstrated how rapidly infectious diseases could emerge and spread, the world faced the AIDS epidemic. Initial efforts to halt the contagion were slow as researchers focused on understanding the epidemiology of the virus. It was only by integrating epidemiological theory with behavioral theory that successful interventions began to control the spread of HIV. 

As the current pandemic persists, it is clear that similar applications of interdisciplinary theory are needed to inform decisions, interventions, and policy. Continued infections and the emergence of new variants are the result of complex interactions among evolution, human behavior, and shifting policies across space and over time. Due to this complexity, predictions about the pandemic based on data and statistical models alone—in the absence of any broader conceptual framework—have proven inadequate. Classical epidemiological theory has helped, but on its own it has had limited success in anticipating surges in COVID-19 infections. Integrating evolutionary theory with data and other theories has revealed more about how and under what conditions new variants arise, improving such predictions.

AIDS and COVID-19 are examples of complex challenges requiring coordination across families of scientific theories and perspectives. They are, in this sense, typical of many issues facing science and society today—climate change, biodiversity decline, and environmental degradation, to name a few. Such problems occupy interdisciplinary space and arise from no-analog conditions (i.e., situations to which there are no current equivalents), as what were previously only local perturbations trigger global instabilities. As with the pandemic crises, they involve interdependencies and new sources of uncertainty, cross levels of governance, span national boundaries, and include interactions at different temporal and spatial scales. 

Such problems, while impossible to solve from a single perspective, may be successfully addressed by integrating multiple theories. …(More)”.

Crowdsourced Politics


Book by Ariadne Vromen, Darren Halpin, Michael Vaughan: “This book focuses on online petitioning and crowdfunding platforms to demonstrate the everyday impact that digital communications have had on contemporary citizen participation. It argues that crowdsourced participation has become normalised and institutionalised into the repertoires of citizens and their organisations. 

To illustrate their arguments, the authors use an original survey on acts of political engagement, undertaken with Australian citizens. Through detailed interviews and online analysis they show how advocacy organisations now use online petitions for strategic interventions and mobilisation. They also analyse the policy issues that mobilise citizens on crowdsourcing platforms, including a unique dataset of 17,000 petitions from the popular non-government platform Change.org. Contrasting mass public concerns with the policy agenda of the government of the day shows there is a disjuncture and a lack of responsiveness to crowdsourced citizen expression. Ultimately the book explores the long-term implications of citizen-led change for democracy…(More)”.

Building Trust to Reinforce Democracy


Main Findings from the 2021 OECD Survey on Drivers of Trust in Public Institutions: “What drives trust in government? This report presents the main findings of the first OECD cross-national survey on trust in government and public institutions, representing over 50 000 responses across 22 OECD countries. The survey measures government performance across five drivers of trust – reliability, responsiveness, integrity, openness, and fairness – and provides insights for future policy reforms. This investigation marks an important initiative by OECD countries to measure and better understand what drives people’s trust in public institutions – a crucial part of reinforcing democracy…(More)”.