The Future of Nudging Will Be Personal


Essay by Stuart Mills: “Nudging, now more than a decade old as an intervention tool, has become something of a poster child for the behavioral sciences. We know that people don’t always act in their own best interest—sometimes spectacularly so—and nudges have emerged as a noncoercive way to live better in a world shaped by our behavioral foibles.

But with nudging’s maturity, we’ve also begun to understand some of the ways that it falls short. Take, for instance, research by Linda Thunström and her colleagues. They found that “successful” nudges can actually harm subgroups of a population. In their research, spendthrifts (those who spend freely) spent less when nudged, bringing them closer to optimal spending. But when given the same nudge, tightwads also spent less, taking them further from the optimal.

While a nudge might appear effective because a population benefited on average, at the individual level the story could be different. Should nudging penalize people who differ from the average just because, on the whole, a policy would benefit the population? Though individual-versus-population trade-offs are part and parcel of policymaking, as our ability to personalize advances through technology and data, these trade-offs seem less and less appealing….(More)”.

The Techlash and Tech Crisis Communication


Book by Nirit Weiss-Blatt: “This book provides an in-depth analysis of the evolution of tech journalism. The emerging tech-backlash is a story of pendulum swings: We are currently in tech-dystopianism after a long period spent in tech-utopianism. Tech companies were used to ‘cheerleading’ coverage of product launches. This long tech-press honeymoon ended, and was replaced by a new era of mounting criticism focused on tech’s negative impact on society. When and why did tech coverage shift? How did tech companies respond to the rise of tech criticism?

The book depicts three main eras: Pre-Techlash, Techlash, and Post-Techlash. The reader is taken on a journey from computer magazines, through tech blogs to the upsurge of tech investigative reporting. It illuminates the profound changes in the power dynamics between the media and the tech giants it covers.

The interplay between tech journalism and tech PR has been underexplored. Through analyses of both tech media and the corporations’ crisis responses, this book examines the roots and characteristics of the Techlash and offers answers to the question ‘How did we get here?’. Insightful observations by tech journalists and tech public relations professionals are added to the research data, and together they tell the story of the TECHLASH. It includes theoretical and practical implications for both tech enthusiasts and critics….(More)”.

A new approach to problem-solving across the Sustainable Development Goals


Alexandra Bracken, John McArthur, and Jacob Taylor at Brookings: “The economic, social, and environmental challenges embedded throughout the world’s 17 Sustainable Development Goals (SDGs) will require many breakthroughs from business as usual. COVID-19 has only underscored the SDGs’ central message that the underlying problems are both interconnected and urgent, so new mindsets are required to generate faster progress on many fronts at once. Our recent report, 17 Rooms: A new approach to spurring action for the Sustainable Development Goals, describes an effort to innovate around the process of SDG problem-solving itself.

17 Rooms aims to advance problem-solving within and across all the SDGs. As a partnership between Brookings and The Rockefeller Foundation, the first version of the undertaking was convened in September 2018, as a single meeting on the eve of the U.N. General Assembly in New York. The initiative has since evolved into a two-pronged effort: an annual flagship process focused on global-scale policy issues and a community-level process in which local actors are taking 17 Rooms methods into their own hands.

In practical terms, 17 Rooms consists of participants from disparate specialist communities each meeting in their own “Rooms,” or working groups, one for each SDG. Each Room is tasked with a common assignment of identifying cooperative actions they can take over the subsequent 12-18 months. Emerging ideas are then shared across Rooms to spot opportunities for collaboration.

The initiative continues to evolve through ongoing experimentation, so methods are not overly fixed, but three design principles help define key elements of the 17 Rooms mindset:

  1. All SDGs get a seat at the table. Insights, participants, and priorities are valued equally across all the specialist communities focused on individual dimensions of the SDGs
  2. Take a next step, not the perfect step. The process encourages participants to identify—and collaborate on—actions that are “big enough to matter, but small enough to get done”
  3. Conversations, not presentations. Discussions are structured around collaboration and peer-learning, aiming to focus on what’s best for an issue, not any individual organization

These principles appear to contribute to three distinct forms of value: the advancement of action, the generation of insights, and a strengthened sense of community among participants….(More)”.

Legislative Performance Futures


Article by Ben Podgursky on “Incentivize Good Laws by Monetizing the Verdict of History”: “…There are net-positive legislative policies which legislators won’t enact, because they only help people in the medium to far future. For example:

  • Climate change policy
  • Infrastructure investments and mass-transit projects
  • Debt control and social security reform
  • Child tax credits

When reforms on these issues are legislated at all (which happens rarely, relative to their future value), they are passed not because of the value provided to future generations, but because of the immediate benefit to voters today:

  • Infrastructure investment goes to “shovel ready” projects, with an emphasis on short-term job creation, even when the prime benefit is to future GDP. For example, dams constructed in the 1930s (the Hoover Dam, the TVA) provide immense value today, but the projects only happened in order to create tens of thousands of jobs.
  • Climate change legislation is usually weakly directed. Instead of policies that deliver significant long-term benefits at short-term cost (i.e., carbon taxes), “green legislation” aims to create green jobs and incentivize rooftop solar (reducing power bills today).
  • (Small) child tax credits are passed to help parents today, even though the vastly larger benefit accrues to children who exist because the marginal extra cash helped their parents afford an extra child.

On the other hand, reforms which provide no benefit to today’s voter do not happen; this is why the upcoming Social Security Trust Fund shortfall will likely not be fixed until benefits are reduced and voters are directly impacted.

The issue is that while the future reaps the benefits or failures of today’s laws, people of the future cannot vote in today’s elections.  In fact, in almost no circumstances does the future have any ability to meaningfully reward or punish past lawmakers; there are debates today about whether to remove statues and rename buildings dedicated to those on the wrong side of history, actions which even proponents acknowledge as entirely symbolic….(More)”.

The Nature of Truth


Book edited by Michael P. Lynch, Jeremy Wyatt, Junyeol Kim and Nathan Kellen: “The question “What is truth?” is so philosophical that it can seem rhetorical. Yet truth matters, especially in a “post-truth” society in which lies are tolerated and facts are ignored. If we want to understand why truth matters, we first need to understand what it is. The Nature of Truth offers the definitive collection of classic and contemporary essays on analytic theories of truth. This second edition has been extensively revised and updated, incorporating both historically central readings on truth’s nature and up-to-the-moment contemporary essays. Seventeen new chapters reflect the current trajectory of research on truth.

Highlights include new essays by Ruth Millikan and Gila Sher on correspondence theories; a new essay on Peirce’s theory by Cheryl Misak; seven new essays on deflationism, laying out both theories and critiques; a new essay by Jamin Asay on primitivist theories; and a new defense by Kevin Scharp of his replacement theory, coupled with a probing critique of replacement theories by Alexis Burgess. Classic essays include selections by J. L. Austin, Donald Davidson, William James, W. V. O. Quine, and Alfred Tarski….(More)”.

Policy 2.0 in the Pandemic World: What Worked, What Didn’t, and Why


Blog by David Osimo: “…So how, then, did these new tools perform when confronted with the once-in-a-lifetime crisis of a vast global pandemic?

It turns out, some things worked. Others didn’t. And the question of how these new policymaking tools functioned in the heat of battle is already generating valuable ammunition for future crises.

So what worked?

Policy modelling – an analytical framework designed to anticipate the impact of decisions by simulating the interaction of multiple agents in a system, rather than just the independent actions of atomised and rational humans – took centre stage in the pandemic and emerged with reinforced importance in policymaking. Notably, it helped governments predict how and when to introduce lockdowns or open up. But even there uptake was limited. A recent survey showed that most of the 28 models used in different countries to fight the pandemic were traditional, not the modern “agent-based” or “system dynamics” models supposed to deal best with uncertainty. Meanwhile, the concepts of systems science were becoming prominent and widely communicated. It quickly became clear in the course of the crisis that social distancing was more a method to reduce the systemic pressure on health services than a way to avoid individual contagion (the so-called “flatten the curve” approach).
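
To make the contrast concrete, here is a minimal sketch of an agent-based contagion simulation in Python. It is purely illustrative and is not one of the models referred to above; every name and parameter (contacts_per_step, p_transmit, distancing) is an assumption chosen only to expose the mechanism.

```python
import random

# Minimal, illustrative agent-based contagion model (all parameters are
# assumptions, not taken from any model cited above). Each infected agent
# meets a few random others per step; "distancing" shrinks the number of
# contacts, which lowers the peak number of simultaneous infections.

def simulate(n_agents=1000, contacts_per_step=8, p_transmit=0.05,
             recovery_steps=10, distancing=0.0, steps=120, seed=1):
    random.seed(seed)
    # state per agent: 0 = susceptible, >0 = infected (steps left), -1 = recovered
    state = [0] * n_agents
    state[0] = recovery_steps  # seed a single infection
    peak = 0
    for _ in range(steps):
        infected = [i for i, s in enumerate(state) if s > 0]
        peak = max(peak, len(infected))
        contacts = max(0, round(contacts_per_step * (1 - distancing)))
        for i in infected:
            # contagion spreads through interactions between agents
            for j in random.sample(range(n_agents), contacts):
                if state[j] == 0 and random.random() < p_transmit:
                    state[j] = recovery_steps
        for i in infected:
            state[i] -= 1
            if state[i] == 0:
                state[i] = -1  # recovered
    return peak

print("peak infections, no distancing:  ", simulate(distancing=0.0))
print("peak infections, 50% distancing: ", simulate(distancing=0.5))
```

The point is not the numbers but the structure: the quantity that matters for policy (peak pressure on health services) emerges from interactions between agents rather than from any individual’s isolated choice, which is the systemic framing of “flatten the curve” described above.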

Open government data has long promised to allow citizens and businesses to build new services at scale and make government accountable. The pandemic largely confirmed how important this data could be to allow citizens to analyse things independently. Hundreds of analysts from all walks of life and disciplines used social media to discuss their analyses and predictions, many becoming household names and go-to people in countries and regions. Yes, this led to noise and a so-called “infodemic,” but overall it served as a fundamental tool to increase confidence and consensus behind the policy measures and to make governments accountable for their actions. For instance, one Catalan analyst demonstrated that vaccines were not provided during weekends and forced the government to change its stance. Yet it is also clear that not all went well, most notably on the supply side. Governments published low-quality data: in PDF format, with delays, or with missing data due to spreadsheet abuse.

In most cases, there was little demand for sophisticated data-publishing solutions such as “linked” or “FAIR” data, although the uptake of these kinds of solutions was particularly significant when it came time to share crucial research data. Experts argue that the trend towards open science has accelerated dramatically and irreversibly in the last year, as shown by the portal https://www.covid19dataportal.org/, which allowed the sharing of high-quality data for scientific research….

But other new policy tools proved less easy to use and ultimately ineffective. Collaborative governance, for one, promised to leverage the knowledge of thousands of citizens to improve public policies and services. In practice, methodologies aimed at involving citizens in decision making and service design were of little use. Decisions related to locking down and opening up were taken in closed committees, in top-down mode. Individual exceptions certainly exist: Milan, one of the cities worst hit by the pandemic, launched a co-created strategy for opening up after the lockdown, receiving almost 3,000 contributions to the consultation. But overall, such initiatives had limited impact and visibility. With regard to co-design of public services, in times of emergency there was no time for prototyping or focus groups. Services such as emergency financial relief had to be launched in a hurry and “just work.”

Citizen science promised to make every citizen a consensual data source for monitoring complex phenomena in real time through apps and Internet-of-Things sensors. In the pandemic, there were initially great expectations for digital contact-tracing apps to allow real-time monitoring of contagion, most notably through Bluetooth connections on the phone. However, they were mostly a disappointment. Citizens were reluctant to install them. And contact tracing soon appeared to be much more complicated – and human-intensive – than originally thought. The huge debate over technology versus privacy was followed by very limited impact. Much ado about nothing.

Behavioural economics (commonly known as nudge theory) is probably the most visible failure of the pandemic. It promised to move beyond traditional carrots (public funding) and sticks (regulation) in delivering policy objectives by adopting an experimental method to influence or “nudge” human behaviour towards desired outcomes. The reality is that soft nudges proved an ineffective alternative to hard lockdown choices. What makes it uniquely negative is that such methods took centre stage in the initial phase of the pandemic and particularly informed the United Kingdom’s lax approach in the first months, on the basis of a hypothetical and unproven “behavioural fatigue.” This attracted heavy criticism of the excessive reliance on nudges by the United Kingdom government, a legacy of Prime Minister David Cameron’s administration. The origin of such criticism seems to lie not in the method’s shortcomings per se, as it had previously enjoyed success on more specific cases, but in the backlash from excessive expectations and promises, epitomised in the quote of a prominent behavioural economist: “It’s no longer a matter of supposition as it was in 2010 […] we can now say with a high degree of confidence these models give you best policy.”

Three factors emerge as the key determinants behind success and failure: maturity, institutions and leadership….(More)”.

Radical Secrecy: The Ends of Transparency in Datafied America


Book by Clare Birchall: “When total data surveillance delimits agency and revelations of political wrongdoing fail to have consequences, is transparency the social panacea liberal democracies purport it to be? This book sets forth the provocative argument that progressive social goals would be better served by a radical form of secrecy, at least while state and corporate forces hold an asymmetrical advantage over the less powerful in data control. Clare Birchall asks: How might transparency actually serve agendas that are far from transparent? Can we imagine a secrecy that could act in the service of, rather than against, a progressive politics?

To move beyond atomizing calls for privacy and to interrupt the perennial tension between state security and the public’s right to know, Birchall adapts Édouard Glissant’s thinking to propose a digital “right to opacity.” As a crucial element of radical secrecy, she argues, this would eventually give rise to a “postsecret” society, offering an understanding and experience of the political that is free from the false choice between secrecy and transparency. She grounds her arresting story in case studies including the varied presidential styles of George W. Bush, Barack Obama, and Donald Trump; the Snowden revelations; conspiracy theories espoused or endorsed by Trump; WikiLeaks and guerrilla transparency; and the opening of the state through data portals.

Postsecrecy is the necessary condition for imagining, finally, an alternative vision of “the good,” of equality, as neither shaped by neoliberal incarnations of transparency nor undermined by secret state surveillance. Not least, postsecrecy reimagines collective resistance in the era of digital data….(More)”.

How Digital Trust Varies Around the World


Bhaskar Chakravorti, Ajay Bhalla, and Ravi Shankar Chaturvedi at Harvard Business Review: “As economies around the world digitalize rapidly in response to the pandemic, one component that can sometimes get left behind is user trust. What does it take to build out a digital ecosystem that users will feel comfortable actually using? To answer this question, the authors explored four components of digital trust: the security of an economy’s digital environment; the quality of the digital user experience; the extent to which users report trust in their digital environment; and the extent to which users actually use the digital tools available to them. They then used almost 200 indicators to rank 42 global economies on their performance in each of these four metrics, finding a number of interesting trends around how different economies have developed mechanisms for engendering trust, as well as how different types of trust do — or don’t — correspond to other digital development metrics…(More)”.
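
The article does not spell out its scoring mechanics, so the following is only a hypothetical Python sketch of how a composite ranking across the four components might be assembled: average pre-scaled indicators into component scores, then rank economies on an equal-weighted mean. The toy data, component identifiers, equal weights, and 0-1 scaling are all assumptions for illustration, not the authors’ methodology.

```python
# Hypothetical composite-index sketch (illustrative assumptions throughout):
# indicators are assumed to be pre-scaled to 0-1, components mirror the four
# named in the article, and economies are ranked on an equal-weighted mean.

COMPONENTS = ["security", "user_experience", "reported_trust", "observed_use"]

# toy data: economy -> component -> raw indicator values (already 0-1)
raw = {
    "Economy A": {"security": [0.7, 0.9], "user_experience": [0.6, 0.8],
                  "reported_trust": [0.5, 0.7], "observed_use": [0.8, 0.9]},
    "Economy B": {"security": [0.4, 0.5], "user_experience": [0.7, 0.6],
                  "reported_trust": [0.6, 0.6], "observed_use": [0.5, 0.4]},
}

def component_score(values):
    # simple average of an indicator group
    return sum(values) / len(values)

def trust_score(economy):
    # equal weights across the four components (an assumption)
    return sum(component_score(raw[economy][c]) for c in COMPONENTS) / len(COMPONENTS)

for rank, economy in enumerate(sorted(raw, key=trust_score, reverse=True), start=1):
    print(rank, economy, round(trust_score(economy), 3))
```

In the actual study, nearly 200 indicators feed the four components across 42 economies, and the choice of weights and normalisation would materially affect the resulting ranking.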

Far-right news sources on Facebook more engaging


Study by Laura Edelson, Minh-Kha Nguyen, Ian Goldstein, Oana Goga, Tobias Lauinger, and Damon McCoy: “Facebook has become a major way people find news and information in an increasingly politically polarized nation. We analyzed how users interacted with different types of posts promoted as news in the lead-up to and aftermath of the U.S. 2020 elections. We found that politically extreme sources tend to generate more interactions from users. In particular, content from sources rated as far-right by independent news rating services consistently received the highest engagement per follower of any partisan group. Additionally, frequent purveyors of far-right misinformation had on average 65% more engagement per follower than other far-right pages. We found:

  • Sources of news and information rated as far-right generate the highest average number of interactions per follower with their posts, followed by sources from the far-left, and then news sources closer to the center of the political spectrum.
  • Looking at the far-right, misinformation sources far outperform non-misinformation sources. Far-right sources designated as spreaders of misinformation had an average of 426 interactions per thousand followers per week, while non-misinformation sources had an average of 259 weekly interactions per thousand followers.
  • Engagement with posts from far-right and far-left news sources peaked around Election Day and again on January 6, the day of the certification of the electoral count and the U.S. Capitol riot. For posts from all other political leanings of news sources, the increase in engagement was much less intense.
  • Center and left partisan categories incur a misinformation penalty, while right-leaning sources do not. Center sources of misinformation, for example, performed about 70% worse than their non-misinformation counterparts. (Note: center sources of misinformation tend to be sites presenting as health news that have no obvious ideological orientation.)…(More)”.

Europe’s Digital Decade: Commission sets the course towards a digitally empowered Europe by 2030


European Commission Press Release: “…The Commission proposes a Digital Compass to translate the EUʼs digital ambitions for 2030 into concrete terms. They revolve around four cardinal points:

1) Digitally skilled citizens and highly skilled digital professionals: By 2030, at least 80% of all adults should have basic digital skills, and there should be 20 million employed ICT specialists in the EU – while more women should take up such jobs;

2) Secure, performant and sustainable digital infrastructures: By 2030, all EU households should have gigabit connectivity and all populated areas should be covered by 5G; the production of cutting-edge and sustainable semiconductors in Europe should be 20% of world production; 10,000 climate-neutral, highly secure edge nodes should be deployed in the EU; and Europe should have its first quantum computer;

3) Digital transformation of businesses: By 2030, three out of four companies should use cloud computing services, big data and Artificial Intelligence; more than 90% of SMEs should reach at least a basic level of digital intensity; and the number of EU unicorns should double;

4) Digitalisation of public services: By 2030, all key public services should be available online; all citizens will have access to their e-medical records; and 80% of citizens should use an eID solution.

The Compass sets out a robust joint governance structure with Member States based on a monitoring system with annual reporting in the form of traffic lights. The targets will be enshrined in a Policy Programme to be agreed with the European Parliament and the Council….(More)”.