Eight (No, Nine!) Problems With Big Data


Gary Marcus and Ernest Davis in the New York Times: “BIG data is suddenly everywhere. Everyone seems to be collecting it, analyzing it, making money from it and celebrating (or fearing) its powers. Whether we’re talking about analyzing zillions of Google search queries to predict flu outbreaks, or zillions of phone records to detect signs of terrorist activity, or zillions of airline stats to find the best time to buy plane tickets, big data is on the case. By combining the power of modern computing with the plentiful data of the digital era, it promises to solve virtually any problem — crime, public health, the evolution of grammar, the perils of dating — just by crunching the numbers.

Or so its champions allege. “In the next two decades,” the journalist Patrick Tucker writes in the latest big data manifesto, “The Naked Future,” “we will be able to predict huge areas of the future with far greater accuracy than ever before in human history, including events long thought to be beyond the realm of human inference.” Statistical correlations have never sounded so good.

Is big data really all it’s cracked up to be? There is no doubt that big data is a valuable tool that has already had a critical impact in certain areas. For instance, almost every successful artificial intelligence computer program in the last 20 years, from Google’s search engine to the I.B.M. “Jeopardy!” champion Watson, has involved the substantial crunching of large bodies of data. But precisely because of its newfound popularity and growing use, we need to be levelheaded about what big data can — and can’t — do.

The first thing to note is that although big data is very good at detecting correlations, especially subtle correlations that an analysis of smaller data sets might miss, it never tells us which correlations are meaningful. A big data analysis might reveal, for instance, that from 2006 to 2011 the United States murder rate was well correlated with the market share of Internet Explorer: Both went down sharply. But it’s hard to imagine there is any causal relationship between the two. Likewise, from 1998 to 2007 the number of new cases of autism diagnosed was extremely well correlated with sales of organic food (both went up sharply), but identifying the correlation won’t by itself tell us whether diet has anything to do with autism.

Second, big data can work well as an adjunct to scientific inquiry but rarely succeeds as a wholesale replacement. Molecular biologists, for example, would very much like to be able to infer the three-dimensional structure of proteins from their underlying DNA sequence, and scientists working on the problem use big data as one tool among many. But no scientist thinks you can solve this problem by crunching data alone, no matter how powerful the statistical analysis; you will always need to start with an analysis that relies on an understanding of physics and biochemistry.

Third, many tools that are based on big data can be easily gamed. For example, big data programs for grading student essays often rely on measures like sentence length and word sophistication, which are found to correlate well with the scores given by human graders. But once students figure out how such a program works, they start writing long sentences and using obscure words, rather than learning how to actually formulate and write clear, coherent text. Even Google’s celebrated search engine, rightly seen as a big data success story, is not immune to “Google bombing” and “spamdexing,” wily techniques for artificially elevating website search placement.

Fourth, even when the results of a big data analysis aren’t intentionally gamed, they often turn out to be less robust than they initially seem. Consider Google Flu Trends, once the poster child for big data. In 2009, Google reported — to considerable fanfare — that by analyzing flu-related search queries, it had been able to detect the spread of the flu as accurately as, and more quickly than, the Centers for Disease Control and Prevention. A few years later, though, Google Flu Trends began to falter; for the last two years it has made more bad predictions than good ones.

As a recent article in the journal Science explained, one major contributing cause of the failures of Google Flu Trends may have been that the Google search engine itself constantly changes, such that patterns in data collected at one time do not necessarily apply to data collected at another time. As the statistician Kaiser Fung has noted, collections of big data that rely on web hits often merge data that was collected in different ways and with different purposes — sometimes to ill effect. It can be risky to draw conclusions from data sets of this kind.

A fifth concern might be called the echo-chamber effect, which also stems from the fact that much of big data comes from the web. Whenever the source of information for a big data analysis is itself a product of big data, opportunities for vicious cycles abound. Consider translation programs like Google Translate, which draw on many pairs of parallel texts from different languages — for example, the same Wikipedia entry in two different languages — to discern the patterns of translation between those languages. This is a perfectly reasonable strategy, except for the fact that with some of the less common languages, many of the Wikipedia articles themselves may have been written using Google Translate. In those cases, any initial errors in Google Translate infect Wikipedia, which is fed back into Google Translate, reinforcing the error.

A sixth worry is the risk of too many correlations. If you look 100 times for correlations between two variables, you risk finding, purely by chance, about five bogus correlations that appear statistically significant — even though there is no actual meaningful connection between the variables. Absent careful supervision, the magnitudes of big data can greatly amplify such errors.
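A quick simulation makes the arithmetic concrete. The sketch below (using an approximate two-tailed significance threshold for samples of size 30) tests 100 pairs of independent random variables and counts how many clear the p < 0.05 bar by chance alone:

```python
# Simulate the multiple-comparisons problem: correlate 100 pairs of
# independent random variables and count spurious "significant" results.
import math
import random

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
n, trials = 30, 100
# Critical |r| for p < 0.05 (two-tailed) at n = 30 is roughly 0.36.
critical_r = 0.36

false_positives = sum(
    abs(pearson_r([random.gauss(0, 1) for _ in range(n)],
                  [random.gauss(0, 1) for _ in range(n)])) > critical_r
    for _ in range(trials)
)
print(false_positives)  # typically in the neighborhood of 5
```

Every variable here is pure noise, yet a handful of pairs still look "statistically significant," which is exactly the roughly 5-in-100 rate the text describes.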

Seventh, big data is prone to giving scientific-sounding solutions to hopelessly imprecise questions. In the past few months, for instance, there have been two separate attempts to rank people in terms of their “historical importance” or “cultural contributions,” based on data drawn from Wikipedia. One is the book “Who’s Bigger? Where Historical Figures Really Rank,” by the computer scientist Steven Skiena and the engineer Charles Ward. The other is an M.I.T. Media Lab project called Pantheon.

Both efforts get many things right — Jesus, Lincoln and Shakespeare were surely important people — but both also make some egregious errors. “Who’s Bigger?” claims that Francis Scott Key was the 19th most important poet in history; Pantheon has claimed that Nostradamus was the 20th most important writer in history, well ahead of Jane Austen (78th) and George Eliot (380th). Worse, both projects suggest a misleading degree of scientific precision with evaluations that are inherently vague, or even meaningless. Big data can reduce anything to a single number, but you shouldn’t be fooled by the appearance of exactitude.

FINALLY, big data is at its best when analyzing things that are extremely common, but often falls short when analyzing things that are less common. For instance, programs that use big data to deal with text, such as search engines and translation programs, often rely heavily on something called trigrams: sequences of three words in a row (like “in a row”). Reliable statistical information can be compiled about common trigrams, precisely because they appear frequently. But no existing body of data will ever be large enough to include all the trigrams that people might use, because of the continuing inventiveness of language.
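A minimal sketch of how such trigram statistics are gathered: common sequences like "sat on the" recur often enough to count reliably, while the overwhelming majority of grammatical trigrams never appear in a corpus at all.

```python
# Count word trigrams in a tiny toy corpus.
from collections import Counter

def trigrams(text):
    """Return all sequences of three consecutive words."""
    words = text.lower().split()
    return [tuple(words[i:i + 3]) for i in range(len(words) - 2)]

corpus = ("the cat sat on the mat and the dog sat on the mat "
          "while the cat sat on the rug")
counts = Counter(trigrams(corpus))

print(counts[("sat", "on", "the")])  # 3 — frequent, hence statistically reliable
print(counts[("dog", "sat", "quietly")])  # 0 — absent, like most possible trigrams
```

Scale the corpus up to petabytes and the same asymmetry holds: frequent trigrams yield solid statistics, but novel phrases remain invisible.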

To select an example more or less at random, a book review that the actor Rob Lowe recently wrote for this newspaper contained nine trigrams such as “dumbed-down escapist fare” that had never before appeared anywhere in all the petabytes of text indexed by Google. To witness the limitations that big data can have with novelty, Google-translate “dumbed-down escapist fare” into German and then back into English: out comes the incoherent “scaled-flight fare.” That is a long way from what Mr. Lowe intended — and from big data’s aspirations for translation.

Wait, we almost forgot one last problem: the hype….

This War of Mine – The Ultimate Serious Game


The Escapist Magazine: “…there are not many games about the effect of war. Paweł Miechowski thinks that needs to change, and he’s doing it with a little game called This War of Mine from the Polish outfit 11 Bit Studios.
“We’re in the moment where we want to talk about important things via games,” Miechowski said. “We are used to the fact that important topics are covered by music, novels, movies, while games mostly about fun. Laughing ‘ha ha ha’ fun.”
In fact, he believes games are well-suited for showing harsh truths and realities, not by ham-fistedly repeating political phrases or mantras, but by allowing you to draw your own conclusions from the circumstances. “Games are perfect for this because they are interactive. Novels or movies are not,” he said. “Games can take you through the experience through your hands, by your eyes. You are not a spectator. You are part of the experience.”
What is the experience of This War of Mine then? 11 Bit Studios was inspired by the firsthand accounts of people who tried to survive within a modern city that had no law, no order or infrastructure due to an ongoing war between militaries. “Everything we did in this game, we did after extensive research. Any mechanics in the game are just a translation of our knowledge of situations in recent history,” he said. “Yugoslavia, Syria, Serbia. Anywhere civilians survived within a besieged city after war. They were all pretty similar, struggling for water, hygiene items, food, simple tools to make something, wood to heat the house up.”
Miechowski showed me an early build of This War of Mine and that’s exactly what it is. Your only goal, which is emblazoned on the screen when you start the game, is to “Survive for 30 days.” You begin inside a 2D representation of a bombed-out building with several floors. You have a few allies with names like Boris or Yvette, each of whom has traits such as “good cook” or “strong, but slow.” Orders can be given to your team, such as to build a bed or to scavenge the piles of junk within your stronghold for any useful items. You usually start out with nothing, but over time you’ll accumulate all sorts of items and materials. The game runs in real time, with the hours slowly ticking by, but once you assign tasks it can be useful to advance the timeline by clicking the “Start Night” button.”

Expanding Opportunity through Open Educational Resources


Hal Plotkin and Colleen Chien at the White House: “Using advanced technology to dramatically expand the quality and reach of education has long been a key priority for the Obama Administration.
In December 2013, the President’s Council of Advisors on Science and Technology (PCAST) issued a report exploring the potential of Massive Open Online Courses (MOOCs) to expand access to higher education opportunities. Last month, the President announced a $2B down payment, along with another $750M in private-sector commitments, to deliver on his ConnectED initiative, which will connect 99% of American K-12 students to broadband by 2017 at no cost to American taxpayers.
This week, we are happy to be joining with educators, students, and technologists worldwide to recognize and celebrate Open Education Week.
Open Educational Resources (“OER”) are educational resources that are released with copyright licenses allowing for their free use, continuous improvement, and modification by others. The world is moving fast, and OER enables educators and students to access, customize, and remix high-quality course materials reflecting the latest understanding of the world and materials that incorporate state-of-the-art teaching methods – adding their own insights along the way. OER is not a silver bullet solution to the many challenges that teachers, students and schools face. But it is a tool increasingly being used, for example by players like edX and the Khan Academy, to improve learning outcomes and create scalable platforms for sharing educational resources that reach millions of students worldwide.
Launched at MIT in 2001 with the OpenCourseWare initiative, OER became a global movement in 2007 when thousands of educators around the globe endorsed the Cape Town Declaration on Open Educational Resources. Another major milestone came in 2011, when Secretary of Education Arne Duncan and then-Secretary of Labor Hilda Solis unveiled the four-year, $2B Trade Adjustment Assistance Community College and Career Training Grant Program (TAACCCT). It was the first Federal program to leverage OER to support the development of a new generation of affordable, post-secondary educational programs that can be completed in two years or less to prepare students for careers in emerging and expanding industries….
Building on this record of success, OSTP and the U.S. Agency for International Development (USAID) are exploring an effort to inspire and empower university students through multidisciplinary OER focused on one of the USAID Grand Challenges, such as securing clean water, saving lives at birth, or improving green agriculture. This effort promises to be a stepping stone towards leveraging OER to help solve other grand challenges such as the NAE Grand Challenges in Engineering or Grand Challenges in Global Health.
This is great progress, but there is more work to do. We look forward to keeping the community updated right here. To see the winning videos from the U.S. Department of Education’s “Why Open Education Matters” Video Contest, click here.”

Big Data, Big New Businesses


Nigel Shadbolt and Michael Chui: “Many people have long believed that if government and the private sector agreed to share their data more freely, and allow it to be processed using the right analytics, previously unimaginable solutions to countless social, economic, and commercial problems would emerge. They may have no idea how right they are.

Even the most vocal proponents of open data appear to have underestimated how many profitable ideas and businesses stand to be created. More than 40 governments worldwide have committed to opening up their electronic data – including weather records, crime statistics, transport information, and much more – to businesses, consumers, and the general public. The McKinsey Global Institute estimates that the annual value of open data in education, transportation, consumer products, electricity, oil and gas, health care, and consumer finance could reach $3 trillion.

These benefits come in the form of new and better goods and services, as well as efficiency savings for businesses, consumers, and citizens. The range is vast. For example, drawing on data from various government agencies, the Climate Corporation (recently bought for $1 billion) has taken 30 years of weather data, 60 years of data on crop yields, and 14 terabytes of information on soil types to create customized insurance products.

Similarly, real-time traffic and transit information can be accessed on smartphone apps to inform users when the next bus is coming or how to avoid traffic congestion. And, by analyzing online comments about their products, manufacturers can identify which features consumers are most willing to pay for, and develop their business and investment strategies accordingly.

Opportunities are everywhere. A raft of open-data start-ups are now being incubated at the London-based Open Data Institute (ODI), which focuses on improving our understanding of corporate ownership, health-care delivery, energy, finance, transport, and many other areas of public interest.

Consumers are the main beneficiaries, especially in the household-goods market. Consumers making better-informed buying decisions across sectors could capture an estimated $1.1 trillion in value annually. Third-party data aggregators are already allowing customers to compare prices across online and brick-and-mortar shops. Many also permit customers to compare quality ratings, safety data (drawn, for example, from official injury reports), information about the provenance of food, and producers’ environmental and labor practices.

Consider the book industry. Bookstores once regarded their inventory as a trade secret. Customers, competitors, and even suppliers seldom knew what stock bookstores held. Nowadays, by contrast, bookstores not only report what stock they carry but also when customers’ orders will arrive. If they did not, they would be excluded from the product-aggregation sites that have come to determine so many buying decisions.

The health-care sector is a prime target for achieving new efficiencies. By sharing the treatment data of a large patient population, for example, care providers can better identify practices that could save $180 billion annually.

The Open Data Institute-backed start-up Mastodon C uses open data on doctors’ prescriptions to differentiate among expensive patent medicines and cheaper “off-patent” varieties; when applied to just one class of drug, that could save around $400 million in one year for the British National Health Service. Meanwhile, open data on acquired infections in British hospitals has led to the publication of hospital-performance tables, a major factor in the 85% drop in reported infections.

There are also opportunities to prevent lifestyle-related diseases and improve treatment by enabling patients to compare their own data with aggregated data on similar patients. This has been shown to motivate patients to improve their diet, exercise more often, and take their medicines regularly. Similarly, letting people compare their energy use with that of their peers could prompt them to save hundreds of billions of dollars in electricity costs each year, to say nothing of reducing carbon emissions.

Such benchmarking is even more valuable for businesses seeking to improve their operational efficiency. The oil and gas industry, for example, could save $450 billion annually by sharing anonymized and aggregated data on the management of upstream and downstream facilities.

Finally, the move toward open data serves a variety of socially desirable ends, ranging from the reuse of publicly funded research to support work on poverty, inclusion, or discrimination, to the disclosure by corporations such as Nike of their supply-chain data and environmental impact.

There are, of course, challenges arising from the proliferation and systematic use of open data. Companies fear for their intellectual property; ordinary citizens worry about how their private information might be used and abused. Last year, Telefónica, the world’s fifth-largest mobile-network provider, tried to allay such fears by launching a digital confidence program to reassure customers that innovations in transparency would be implemented responsibly and without compromising users’ personal information.

The sensitive handling of these issues will be essential if we are to reap the potential $3 trillion in value that usage of open data could deliver each year. Consumers, policymakers, and companies must work together, not just to agree on common standards of analysis, but also to set the ground rules for the protection of privacy and property.”

Innovating for the Global South: New book offers practical insights


Press Release: “Despite the vast wealth generated in the last half century, in today’s world inequality is worsening and poverty is becoming increasingly chronic. Hundreds of millions of people continue to live on less than $2 per day and lack basic human necessities such as nutritious food, shelter, clean water, primary health care, and education.
Innovating for the Global South: Towards an Inclusive Innovation Agenda, the latest book from Rotman-UTP Publishing and the first volume in the Munk Series on Global Affairs, offers fresh solutions for reducing poverty in the developing world. Highlighting the multidisciplinary expertise of the University of Toronto’s Global Innovation Group, leading experts from the fields of engineering, public health, medicine, management, and public policy examine the causes and consequences of endemic poverty and the challenges of mitigating its effects from the perspective of the world’s poorest of the poor.
Can we imagine ways to generate solar energy to run essential medical equipment in the countryside? Can we adapt information and communication technologies to provide up-to-the-minute agricultural market prices for remote farming villages? How do we create more inclusive innovation processes to hear the voices of those living in urban slums? Is it possible to reinvent a low-cost toilet that operates beyond the water and electricity grids?
Motivated by the imperatives of developing, delivering, and harnessing innovation in the developing world, Innovating for the Global South is essential reading for managers, practitioners, and scholars of development, business, and policy.
“As we see it, Innovating for the Global South is fundamentally about innovating scalable solutions that mitigate the effects of poverty and underdevelopment in the Global South. It is not about inventing some new gizmo for some untapped market in the developing world,” say Profs. Dilip Soman and Joseph Wong of the UofT, who are two of the editors of the volume.
The book is edited by, and features contributions from, three leading UofT thinkers who are tackling innovation in the Global South from three different academic perspectives.

  • Dilip Soman is Corus Chair in Communication Strategy and a professor of Marketing at the Rotman School of Management.
  • Janice Gross Stein is the Belzberg Professor of Conflict Management in the Department of Political Science and Director of the Munk School of Global Affairs.
  • Joseph Wong is Ralph and Roz Halbert Professor of Innovation at the Munk School of Global Affairs and Canada Research Chair in Democratization, Health, and Development in the Department of Political Science.

The chapters in the book address the process of innovation from a number of vantage points.
Introduction: Rethinking Innovation – Joseph Wong and Dilip Soman
Chapter 1: Poverty, Invisibility, and Innovation – Joseph Wong
Chapter 2: Behaviourally Informed Innovation – Dilip Soman
Chapter 3: Appropriate Technologies for the Global South – Yu-Ling Cheng (University of Toronto, Chemical Engineering and Applied Chemistry) and Beverly Bradley (University of Toronto, Centre for Global Engineering)
Chapter 4: Globalization of Biopharmaceutical Innovation: Implications for Poor-Market Diseases – Rahim Rezaie (University of Toronto, Munk School of Global Affairs, Research Fellow)
Chapter 5: Embedded Innovation in Health – Anita M. McGahan (University of Toronto, Rotman School of Management, Associate Dean of Research), Rahim Rezaie and Donald C. Cole (University of Toronto, Dalla Lana School of Public Health)
Chapter 6: Scaling Up: The Case of Nutritional Interventions in the Global South – Ashley Aimone Phillips (Registered Dietitian), Nandita Perumal (University of Toronto, Doctoral Fellow, Epidemiology), Carmen Ho (University of Toronto, Doctoral Fellow, Political Science), and Stanley Zlotkin (University of Toronto and the Hospital for Sick Children, Paediatrics, Public Health Sciences and Nutritional Sciences)
Chapter 7: New Models for Financing Innovative Technologies and Entrepreneurial Organizations in the Global South – Murray R. Metcalfe (University of Toronto, Centre for Global Engineering, Globalization)
Chapter 8: Innovation and Foreign Policy – Janice Gross Stein
Conclusion: Inclusive Innovation – Will Mitchell (University of Toronto, Rotman School of Management, Strategic Management), Anita M. McGahan”

Index: Designing for Behavior Change


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on designing for behavior change and was originally published in 2014.

  • Year the Behavioural Insights or “Nudge” Team was established by David Cameron in the U.K.: 2010
  • Amount saved per year by the U.K. Courts Service since the creation of the Nudge unit, by sending personalized text messages persuading people who owe fines to pay promptly: £30m
    • Entire budget for the Behavioural Insights Team: less than £1 million
    • Estimated reduction in bailiff interventions through the use of personalized text reminders: 150,000 fewer interventions annually
  • Increase in the percentage of British residents who paid their taxes on time after receiving a letter saying that most citizens in their neighborhood pay their taxes on time: 15%
  • Estimated increase in organ-donor registrations in the U.K. if people are asked “If you needed an organ transplant, would you take one?”: 96,000
  • Proportion of employees who now have a workplace pension since the U.K. government switched from opt-in to opt-out (illustrating the power of defaults): 83%, up from 63% before the switch
  • Increase in 401(k) enrollment rates within the U.S. by changing the default from ‘opt in’ to ‘opt out’: from 13% to 80%
  • Behavioral studies have shown that consumers overestimate savings from credit cards with no annual fees. Reduction in overall borrowing costs to consumers by requiring card issuers to tell consumers how much it would cost them in fees and interest, under the 2009 CARD Act in the U.S.: 1.7% of average daily balances 
  • Many high school students and their families in the U.S. find financial aid forms for college complex and thus delay filling them out. Increase in college enrollment as a result of being helped to complete the FAFSA financial aid form by an H&R tax professional, who then provided immediate estimates of the amount of aid the student was eligible for, and the net tuition cost of four nearby public colleges: 26%
  • How much more likely people are to keep accounting records, calculate monthly revenues, and separate their home and business books if given “rules of thumb”-based training in managing their finances, according to a randomized control trial conducted in a bank in the Dominican Republic: 10%
  • Elderly Americans are asked to choose from over 40 options when enrolling in Medicare Part D private drug plans. Increase in plan-switching to save money among those who received a letter providing information about three plans that would be cheaper for them: almost double
    • The amount saved on average per person by switching plans due to this intervention: $150 per year
  • Increase in prescriptions to manage cardiac disease when Medicaid enrollees are sent a suite of behavioral nudges such as more salient description of the consequences of remaining untreated and post-it note reminders during an experiment in the U.S.: 78%
  • Reduction in street-litter when a trail of green footprints leading to nearby garbage cans is stenciled on the ground during an experiment in Copenhagen, Denmark: 46%
  • Reduction in missed National Health Service appointments in the U.K. when patients are asked to fill out their own appointment cards: 18%
    • Reduction in missed appointments when patients are also made aware of the number of people who attend their appointments on time: 31%
    • The cost of non-attendance per year for the National Health Service: £700m 
  • How many people in a U.S. experiment chose to ‘downsize’ their meals when asked, regardless of whether they received a discount for the smaller portion: 14-33%
    • Average reduction in calories as a result of downsizing: 200
  • Proportion of households in the U.K. without properly insulated attics, leading to high energy consumption and bills: 40%
    • Result of offering group discounts to motivate households to insulate their attics: no effect
    • Increase in households that agreed to insulate their attics when offered loft-clearing services, even though they had to pay for the service: 4.8-fold


Selected Readings on Behavioral Economics: Nudges


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of behavioral economics was originally published in 2014.

The 2008 publication of Richard Thaler and Cass Sunstein’s Nudge ushered in a new era of behavioral economics, and since then, policy makers in the United States and elsewhere have been applying behavioral economics to the field of public policy. Like Smart Disclosure, behavioral economics can be used in the public sector to improve the decision-making ability of citizens without relying on regulatory interventions. In the six years since Nudge was published, the United Kingdom has created the Behavioural Insights Team (also known as the Nudge Unit), a cross-ministerial organization that uses behavioral economics to inform public policy, and the White House has recently followed suit by convening a team of behavioral economists of its own. Policymakers have been using behavioral insights to design more effective interventions in fields such as long-term unemployment, roadway safety, enrollment in retirement plans, and enrollment in organ-donation registries, to name some noteworthy examples. The literature of this nascent field reflects growing optimism about the potential of applying behavioral insights in the public sector to improve people’s lives.

Selected Reading List (in alphabetical order)

  • John Beshears, James Choi, David Laibson and Brigitte C. Madrian – The Importance of Default Options for Retirement Savings Outcomes: Evidence from the United States – a paper examining the role default options play in encouraging intelligent retirement savings decisionmaking.
  • Cabinet Office and Behavioural Insights Team, United Kingdom – Applying Behavioural Insights to Health – a paper outlining some examples of behavioral economics being applied to the healthcare landscape using cost-efficient interventions.
  • Matthew Darling, Saugato Datta and Sendhil Mullainathan – The Nature of the BEast: What Behavioral Economics Is Not – a paper discussing why control and behavioral economics are not as closely aligned as some think, reiterating the fact that the field is politically agnostic.
  • Antoinette Schoar and Saugato Datta – The Power of Heuristics – a paper exploring the concept of “heuristics,” or rules of thumb, which can provide helpful guidelines for pushing people toward making “reasonably good” decisions without a full understanding of the complexity of a situation.
  • Richard H. Thaler and Cass R. Sunstein – Nudge: Improving Decisions About Health, Wealth, and Happiness – an influential book describing the many ways in which the principles of behavioral economics can be and have been used to influence choices and behavior through the development of new “choice architectures.” 
  • U.K. Parliament Science and Technology Committee – Behaviour Change – an exploration of the government’s attempts to influence the behaviour of its citizens through nudges, with a focus on comparing the effectiveness of nudges to that of regulatory interventions.

Annotated Selected Reading List (in alphabetical order)

Beshears, John, James Choi, David Laibson and Brigitte C. Madrian. “The Importance of Default Options for Retirement Savings Outcomes: Evidence from the United States.” In Jeffrey R. Brown, Jeffrey B. Liebman and David A. Wise, editors, Social Security Policy in a Changing Environment, Cambridge: National Bureau of Economic Research, 2009. http://bit.ly/LFmC5s.

  • This paper examines the role default options play in pushing people toward making intelligent decisions regarding long-term savings and retirement planning.
  • Importantly, the authors provide evidence that a single strategically chosen default at enrollment is likely not enough to fully nudge people toward the best possible decisions in retirement savings. They find that the default settings in every major dimension of the savings process (from deciding whether to participate in a 401(k) to how to withdraw money at retirement) have real and distinct effects on behavior.

Cabinet Office and Behavioural Insights Team, United Kingdom. “Applying Behavioural Insights to Health.” December 2010. http://bit.ly/1eFP16J.

  • In this report, the United Kingdom’s Behavioural Insights Team does not attempt to “suggest that behaviour change techniques are the silver bullet that can solve every problem.” Rather, they explore a variety of examples where local authorities, charities, government and the private sector are using behavioural interventions to encourage healthier behaviors.
  • The report features case studies regarding behavioral insights’ ability to affect the following public health issues:
    • Smoking
    • Organ donation
    • Teenage pregnancy
    • Alcohol
    • Diet and weight
    • Diabetes
    • Food hygiene
    • Physical activity
    • Social care
  • The report concludes with a call for more experimentation and knowledge gathering to determine when, where and how behavioural interventions can be most effective in helping the public become healthier.

Darling, Matthew, Saugato Datta and Sendhil Mullainathan. “The Nature of the BEast: What Behavioral Economics Is Not.” The Center for Global Development. October 2013. https://bit.ly/2QytRmf.

  • In this paper, Darling, Datta and Mullainathan outline the three most pervasive myths that abound within the literature about behavioral economics:
    • First, they dispel the relationship between control and behavioral economics. Although tools used within behavioral economics can convince people to make certain choices, the goal is to nudge people to make the choices they want to make. For example, studies find that when retirement savings plans enroll workers by default (requiring them to opt out rather than opt in), more workers participate in 401(k) plans. This is an example of a nudge that guides people to make a choice that they already intend to make.
    • Second, they reiterate that the field is politically agnostic. Both liberals and conservatives have adopted behavioral economics and its approach is neither liberal nor conservative. President Obama embraces behavioral economics but the United Kingdom’s conservative party does, too.
    • Third, they highlight that irrationality actually has little to do with behavioral economics. Context is an important consideration when one considers what behavior is rational and what behavior is not. Rather than use the term “irrational” to describe human beings, the authors assert that humans are “infinitely complex” and behavior that is often considered irrational is entirely situational.

Schoar, Antoinette and Saugato Datta. “The Power of Heuristics.” Ideas42. January 2014. https://bit.ly/2UDC5YK.

  • This paper explores the notion that being presented with a bevy of options can be desirable in many situations, but when making an intelligent decision requires a high-level understanding of the nuances of vastly different financial aid packages, for example, options can overwhelm. Heuristics (rules of thumb) provide helpful guidelines that “enable people to make ‘reasonably good’ decisions without needing to understand all the complex nuances of the situation.”
  • The underlying goal of heuristics in the policy space is to give people the kind of “rules of thumb” that enable good decisionmaking regarding complex topics such as finance, healthcare and education. The authors point to the benefit of asking individuals to remember smaller pieces of knowledge by referencing a series of studies conducted by psychologists Beatty and Kahneman that showed people were better able to remember long strings of numbers when they were broken into smaller segments.
  • Schoar and Datta recommend these four rules when implementing heuristics:
    • Use heuristics where possible, particularly in complex situations;
    • Leverage new technology (such as text messages and Internet-based tools) to implement heuristics;
    • Determine where heuristics can be used in adult training programs and replace in-depth training programs with heuristics where possible; and
    • Consider how to apply heuristics in situations where the exception is the rule. The authors point to the example of savings and credit card debt. In most instances, saving a portion of one’s income is a good rule of thumb. However, when one has high credit card debt, paying off debt could be preferable to building one’s savings.

Thaler, Richard H. and Cass R. Sunstein. Nudge: Improving Decisions About Health, Wealth, and Happiness. Yale University Press, 2008. https://bit.ly/2kNXroe.

  • This book, likely the single piece of scholarship most responsible for bringing the concept of nudges into the public consciousness, explores how a strategic “choice architecture” can help people make the best decisions.
  • Thaler and Sunstein, while advocating for the wider and more targeted use of nudges to help improve people’s lives without resorting to overly paternal regulation, look to five common nudges for lessons and inspiration:
    • The design of menus gets you to eat (and spend) more;
    • “Flies” in urinals improve, well, aim;
    • Credit card minimum payments affect repayment schedules;
    • Automatic savings programs increase savings rates; and
    • “Defaults” can improve rates of organ donation.
  • In the simplest terms, the authors propose the wider deployment of choice architectures that follow “the golden rule of libertarian paternalism: offer nudges that are most likely to help and least likely to inflict harm.”

U.K. Parliament Science and Technology Committee. “Behaviour Change.” July 2011. http://bit.ly/1cbYv5j.

  • This report from the U.K.’s Science and Technology Committee explores the government’s attempts to influence the behavior of its citizens through nudges, with a focus on comparing the effectiveness of nudges to that of regulatory interventions.
  • The Committee’s central conclusion is that, “non-regulatory measures used in isolation, including ‘nudges,’ are less likely to be effective. Effective policies often use a range of interventions.”
  • The report’s other major findings and recommendations are:
    • Government must invest in gathering more evidence about what measures work to influence population behaviour change;
    • They should appoint an independent Chief Social Scientist to provide them with robust and independent scientific advice;
    • The Government should take steps to implement a traffic light system of nutritional labelling on all food packaging; and
    • Current voluntary agreements with businesses in relation to public health have major failings. They are not a proportionate response to the scale of the problem of obesity and do not reflect the evidence about what will work to reduce obesity. If effective agreements cannot be reached, or if they show minimal benefit, the Government should pursue regulation.

Nudging News Producers and Consumers Toward More Thoughtful, Less Polarized Discourse


New paper by Darrell West and Beth Stone from Brookings: “At a time of extraordinary domestic and international policy challenges, Americans need high-quality news. Readers and viewers must decipher the policy options that the country faces and the manner in which various decisions affect them personally. It often is not readily apparent how to assess complicated policy choices and what the best steps are for moving forward.

Having poor quality news coverage is especially problematic when the political process is sharply polarized.  As has been documented by political scientists Tom Mann and Norman Ornstein, the United States has a Congress today where the most conservative Democrat is to the left of the most moderate Republican. [1]  There are many reasons for this spike in polarization, but there is little doubt that the news media amplify and exacerbate social and political divisions.
Too often, journalists follow a “Noah’s Ark” approach to coverage in which a strong liberal is paired with a vocal conservative in an ideological food fight.  The result is polarization of discourse and “false equivalence” in reporting. This lack of nuanced analysis confuses viewers and makes it difficult for them to sort out the contrasting facts and opinions.  People get the sense that there are only two policy options and that there are few gradations or complexities in the positions that are reported.
In this paper, West and Stone review challenges facing the news media in an age of political polarization.  This includes hyper-competitiveness in news coverage, a dramatic decline in local journalism and resulting nationalization of the news, and the personalization of coverage.  After discussing these problems and how they harm current reporting, they present several ideas for nudging news producers and consumers towards more thoughtful and less polarizing responses.”

The Age of ‘Infopolitics’


Colin Koopman in the New York Times: “We are in the midst of a flood of alarming revelations about information sweeps conducted by government agencies and private corporations concerning the activities and habits of ordinary Americans. After the initial alarm that accompanies every leak and news report, many of us retreat to the status quo, quieting ourselves with the thought that these new surveillance strategies are not all that sinister, especially if, as we like to say, we have nothing to hide.
One reason for our complacency is that we lack the intellectual framework to grasp the new kinds of political injustices characteristic of today’s information society. Everyone understands what is wrong with a government’s depriving its citizens of freedom of assembly or liberty of conscience. Everyone (or most everyone) understands the injustice of government-sanctioned racial profiling or policies that produce economic inequality along color lines. But though nearly all of us have a vague sense that something is wrong with the new regimes of data surveillance, it is difficult for us to specify exactly what is happening and why it raises serious concern, let alone what we might do about it.
Our confusion is a sign that we need a new way of thinking about our informational milieu. What we need is a concept of infopolitics that would help us understand the increasingly dense ties between politics and information. Infopolitics encompasses not only traditional state surveillance and data surveillance, but also “data analytics” (the techniques that enable marketers at companies like Target to detect, for instance, if you are pregnant), digital rights movements (promoted by organizations like the Electronic Frontier Foundation), online-only crypto-currencies (like Bitcoin or Litecoin), algorithmic finance (like automated micro-trading) and digital property disputes (from peer-to-peer file sharing to property claims in the virtual world of Second Life). These are only the tip of an enormous iceberg that is drifting we know not where.
Surveying this iceberg is crucial because atop it sits a new kind of person: the informational person. Politically and culturally, we are increasingly defined through an array of information architectures: highly designed environments of data, like our social media profiles, into which we often have to squeeze ourselves. The same is true of identity documents like your passport and individualizing dossiers like your college transcripts. Such architectures capture, code, sort, fasten and analyze a dizzying number of details about us. Our minds are represented by psychological evaluations, education records, credit scores. Our bodies are characterized via medical dossiers, fitness and nutrition tracking regimens, airport security apparatuses. We have become what the privacy theorist Daniel Solove calls “digital persons.” As such we are subject to infopolitics (or what the philosopher Grégoire Chamayou calls “datapower,” the political theorist Davide Panagia “datapolitik” and the pioneering thinker Donna Haraway “informatics of domination”).
Today’s informational person is the culmination of developments stretching back to the late 19th century. It was in those decades that a number of early technologies of informational identity were first assembled. Fingerprinting was implemented in colonial India, then imported to Britain, then exported worldwide. Anthropometry — the measurement of persons to produce identifying records — was developed in France in order to identify recidivists. The registration of births, which has since become profoundly important for initiating identification claims, became standardized in many countries, with Massachusetts pioneering the way in the United States before a census initiative in 1900 led to national standardization. In the same era, bureaucrats visiting rural districts complained that they could not identify individuals whose names changed from context to context, which led to initiatives to universalize standard names. Once fingerprints, biometrics, birth certificates and standardized names were operational, it became possible to implement an international passport system, a social security number and all other manner of paperwork that tells us who someone is. When all that paper ultimately went digital, the reams of data about us became radically more accessible and subject to manipulation, which has made us even more informational.
We like to think of ourselves as somehow apart from all this information. We are real — the information is merely about us. But what is it that is real? What would be left of you if someone took away all your numbers, cards, accounts, dossiers and other informational prostheses? Information is not just about you — it also constitutes who you are….”

Needed: A New Generation of Game Changers to Solve Public Problems


Beth Noveck: “In order to change the way we govern, it is important to train and nurture a new generation of problem solvers who possess the multidisciplinary skills to become effective agents of change. That’s why we at the GovLab have launched The GovLab Academy with the support of the Knight Foundation.
In an effort to help people in their own communities become more effective at developing and implementing creative solutions to compelling challenges, The GovLab Academy is offering two new training programs:
1) An online platform with an unbundled and evolving set of topics, modules and instructors on innovations in governance, including themes such as big and open data and crowdsourcing and forthcoming topics on behavioral economics, prizes and challenges, open contracting and performance management for governance;
2) Gov 3.0: A curated and sequenced, 14-week mentoring and training program.
While the online platform is always freely available, Gov 3.0 begins on January 29, 2014 and we invite you to participate. Please forward this email to your networks and help us spread the word about the opportunity to participate.
Please consider applying (individuals or teams may apply) if you are:

  • an expert in communications, public policy, law, computer science, engineering, business or design who wants to expand your ability to bring about social change;

  • a public servant who wants to bring innovation to your job;

  • someone with an important idea for positive change but who lacks key skills or resources to realize the vision;

  • interested in joining a network of like-minded, purpose-driven individuals across the country; or

  • someone who is passionate about using technology to solve public problems.

The program includes live instruction and conversation every Wednesday from 5:00–6:30 PM EST for 14 weeks starting Jan 29, 2014. You will be able to participate remotely via Google Hangout.

Gov 3.0 will allow you to apply evolving technology to the design and implementation of effective solutions to public interest challenges. It will give you an overview of the most current approaches to smarter governance and help you improve your skills in collaboration, communication, and developing and presenting innovative ideas.

Over 14 weeks, you will develop a project and a plan for its implementation, including a long and short description, a presentation deck, a persuasive video and a project blog. Last term’s projects covered such diverse issues as post-Fukushima food safety, science literacy for high schoolers and prison reform for the elderly. In every case, the goal was to identify realistic strategies for making a difference quickly.  You can read the entire Gov 3.0 syllabus here.

The program will include national experts and instructors in technology and governance both as guests and as mentors to help you design your project. Last term’s mentors included current and former officials from the White House and various state, local and international governments, academics from a variety of fields, and prominent philanthropists.

People who complete the program will have the opportunity to apply for a special fellowship to pursue their projects further.

Previously taught only on campus, Gov 3.0 is now being offered in beta as an online program. This is not a MOOC. It is a mentoring-intensive coaching experience. To maximize the quality of the experience, enrollment is limited.

Please submit your application by January 22, 2014. Accepted applicants (individuals and teams) will be notified on January 24, 2014. We hope to expand the program in the future so please use the same form to let us know if you would like to be kept informed about future opportunities.”