What works?


Dan Davies at The Long + Short: “Evidence-based policy is very much in fashion at the moment in all departments of government. Of course it’s a good idea; the main argument for it is summarised admirably by the name. But people who expect big things from evidence-based approaches ought to be really quite worried right now.

Because the methodology used in a lot of evidence-based policy analysis is very similar to that used in experimental psychology. And at the moment, psychology is a subject with some very serious methodological problems.

It’s being called the ‘reproducibility crisis’ and in summary, the problem is that large-scale and careful attempts to replicate some of the best-established and most important results of the last few decades are not finding the effects they were meant to find. This is even happening for effects like ‘ego depletion’ (the idea that resisting temptation requires effort and makes it harder to exercise willpower), which are the subject of dozens or even hundreds of research papers.

There appear to be two related problems. First, there is a knot of issues relating to methodology and the interpretation of statistical tests, which means that there is a systematic tendency to find too many statistically significant results. And second, it turns out that a lot of psychology results are just ‘fragile’ – they describe much smaller sets of individuals than hoped, and are very dependent on particular situations, rather than reflecting broad truths about humanity.

Both of these problems are likely to be shared by a lot of other areas. For example, the methodology of behavioural economics has a very big overlap with experimental psychology, and is likely to have many of the same reproducibility issues. So lots of ‘nudge’ schemes related to savings and pensions could be based on fragile results….(More)”

Innovation Prizes in Practice and Theory


Paper by Michael J. Burstein and Fiona Murray: “Innovation prizes in reality are significantly different from innovation prizes in theory. The former are familiar from popular accounts of historical prizes like the Longitude Prize: the government offers a set amount for a solution to a known problem, like £20,000 for a method of calculating longitude at sea. The latter are modeled as compensation to inventors in return for donating their inventions to the public domain. Neither the economic literature nor the policy literature that led to the 2010 America COMPETES Reauthorization Act — which made prizes a prominent tool of government innovation policy — provides a satisfying justification for the use of prizes, nor does either literature address their operation. In this article, we address both of these problems. We use a case study of one canonical, high-profile innovation prize — the Progressive Insurance Automotive X Prize — to explain how prizes function as institutional means to achieve exogenously defined innovation policy goals in the face of significant uncertainty and information asymmetries. Focusing on the structure and function of actual innovation prizes as an empirical matter enables us to make three theoretical contributions to the current understanding of prizes. First, we offer a stronger normative justification for prizes grounded in their status as a key institutional arrangement for solving a specified innovation problem. Second, we develop a model of innovation prize governance and then situate that model in the administrative state, as a species of “new governance” or “experimental” regulation. Third, we derive from those analyses a novel framework for choosing among prizes, patents, and grants, one in which the ultimate choice depends on a trade-off between the efficacy and scalability of the institutional solution….(More)”

Nudging voters


John Hasnas and Annette Hasnas at The Hill: “A perennial complaint about our democracy is that too large a portion of the electorate is poorly informed about important political issues. This is the problem of the ignorant voter. Especially this year, with its multiplicity of candidates, keeping track of the candidates’ various, and often shifting, policy positions can be extraordinarily difficult. As a result, many of those voting in the presidential primaries will cast their ballots with little idea of where the candidates stand on several important issues.

Isn’t there some way to nudge the voters into making more informed choices? Well, actually, yes, there is. But in making this claim, we use the word nudge advisedly.

Among contemporary policy analysts, “nudge” is a term of art. It refers to creating a context within which people make choices – a “choice architecture” – that makes it more likely that people will select one option rather than another. The typical example of a nudge is a school cafeteria in which fruits and vegetables are placed up front in easy-to-reach locations and less healthy fare is placed in less visible and harder-to-reach locations. No one is forced to select the fruit or vegetables, but the choice architecture makes it more likely that people will.
The key feature of a nudge is that it is not coercive. It is an effort to influence choice, not to impose it. People are always able to “opt out” of the nudge. Thus, to nudge is to design the context in which individuals make decisions so as to influence their choice without eliminating any options.

We think that nudging can be employed to help voters make more informed decisions in the voting booth.

Imagine the following scenario. A bipartisan good government group creates a list of the most significant contemporary policy issues. It then invites all candidates to state their positions on the issues. In the current campaign, candidates could be invited to state where they stand on gay marriage, immigration, intervention in Syria, climate change, tax reform, the minimum wage, gun control, income inequality, etc. This information would be collected and fed into the relevant election commission computer. When voters enter the voting booth, they would have the option of electronically recording their policy preferences on the same form that the candidates completed. The computer would display a ranking of the candidates on the basis of how closely their positions aligned with the voter’s. After receiving this information, voters would cast their ballots.
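The ranking step of this scenario is easy to picture as a small computation. The sketch below is purely illustrative (the issue list, position coding, and function names are our own assumptions, not part of any real election-commission system): each candidate and each voter records a stance per issue, and candidates are ordered by how many stances match.

```python
# Hypothetical sketch of the proposed voting-booth ranking.
# Positions are coded per issue: -1 (oppose), 0 (no position), +1 (support).

ISSUES = ["gay marriage", "immigration", "intervention in Syria",
          "climate change", "tax reform", "minimum wage",
          "gun control", "income inequality"]

def alignment_score(voter, candidate):
    """Fraction of issues on which the voter's and candidate's stances match."""
    matches = sum(1 for issue in ISSUES
                  if voter.get(issue) == candidate["positions"].get(issue))
    return matches / len(ISSUES)

def rank_candidates(voter, candidates):
    """Return candidates ordered from closest to furthest alignment."""
    return sorted(candidates,
                  key=lambda c: alignment_score(voter, c),
                  reverse=True)
```

A voter who supports every listed position would see a candidate with identical stances ranked first; a candidate who opposed everything would rank last. Real alignment scoring could of course weight issues by how much the voter cares about each, but a simple match count captures the idea.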

Our proposal is a nudge. It is completely non-coercive. No candidate would be required to complete the list of his or her policy positions, although refusing to do so might be viewed negatively by voters. No voter would be required to utilize the option. All would remain free to simply walk into the booth and cast their vote. Even those who utilize the option remain free to disregard its results and vote for whomever they please. The proposal simply alters the choice architecture of voting to build in access to a source of information about the candidates. Yet, it makes it more likely that citizens will cast more informed votes than they do at present….(More)”

Big data, meet behavioral science


 at Brookings: “America’s community colleges offer the promise of a more affordable pathway to a bachelor’s degree. Students can pay substantially less for the first two years of college, transfer to a four-year college or university, and still earn their diploma in the same amount of time. At least in theory. Most community college students—80 percent of them—enter with the intention to transfer, but only 20 percent actually do so within five years of entering college. This divide represents a classic case of what behavioralists call an intention-action gap.

Why would so many students who enter community colleges intending to transfer fail to actually do so? Put yourself in the shoes of a 20-something community college student. You’ve worked hard for the past couple of years, earning credits and paying a lot less in tuition than you would have if you had enrolled immediately in a four-year college or university. But now you want to transfer, so that you can complete your bachelor’s degree. How do you figure out where to go? Ideally you’d probably like to find a college that would accept most of your credits, where you’re likely to graduate, and where the degree is going to count for something in the labor market. A college advisor could probably help you figure this out, but at many community colleges there are at least 1,000 other students assigned to your advisor, so you might have a hard time getting a quality meeting. Some states have articulation agreements between two- and four-year institutions that guarantee admission for students who complete certain course sequences and perform at a high enough level. But these agreements are often dense and inaccessible.

The combination of big data and behavioral insights has the potential to help students navigate these complex decisions and successfully follow through on their intentions. Big data analytic techniques allow us to identify concrete transfer pathways where students are positioned to succeed; behavioral insights ensure we communicate these options in a way that maximizes students’ engagement and responsiveness….A growing body of innovative research has demonstrated that, by applying behavioral science insights to the way we communicate with students and families about the opportunities and resources available to them, we can help people navigate these complex decisions and experience better outcomes as a result. A combination of simplified information, reminders, and access to assistance has improved achievement and attainment up and down the education pipeline, nudging parents to practice early-literacy activities with their kids or check in with their high schoolers about missed assignments, and encouraging students to renew their financial aid for college….

These types of big data techniques are already being used in some education sectors. For instance, a growing number of colleges use predictive analytics to identify struggling students who need additional assistance, so faculty and administrators can intervene before the student drops out. But frequently, once the results of these predictive analyses are in hand, there is insufficient attention to how to communicate the information in a way that is likely to lead to behavior change among students or educators. And much of the predictive analytics work has focused on plugging leaks in the pipeline (e.g., preventing drop-outs from higher education), rather than on proactively sending students and families personalized information about educational and career pathways where they are likely to flourish…(More)”
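To make the "concrete transfer pathways" idea tangible, here is a minimal sketch of how historical transfer records might be aggregated and ranked. The data shape, field names, and the combined score (credit acceptance times graduation rate) are our own illustrative assumptions, not a description of any real analytics system:

```python
# Hypothetical sketch: rank four-year transfer destinations using
# historical outcomes of past community college transfers.
# Record format and scoring are illustrative assumptions.
from collections import defaultdict

def pathway_stats(records):
    """Aggregate records of (destination, credits_accepted_fraction, graduated)
    into {destination: (avg_credit_acceptance, graduation_rate)}."""
    totals = defaultdict(lambda: [0.0, 0, 0])  # [credit_sum, grad_count, n]
    for dest, credit_frac, graduated in records:
        t = totals[dest]
        t[0] += credit_frac
        t[1] += int(graduated)
        t[2] += 1
    return {dest: (t[0] / t[2], t[1] / t[2]) for dest, t in totals.items()}

def rank_pathways(records):
    """Order destinations by a simple combined score:
    average credit acceptance multiplied by graduation rate."""
    stats = pathway_stats(records)
    return sorted(stats,
                  key=lambda d: stats[d][0] * stats[d][1],
                  reverse=True)
```

A real system would condition on the student's program, credits earned, and labor-market outcomes, but even this toy version shows how historical records can be turned into a personalized, ranked shortlist rather than a dense articulation agreement.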

Guidance for Developing a Local Digital Response Network


Guide by Jenny Phillips and Andrej Verity: “…Beyond the obvious desire to create the guidance document, we had three objectives when drafting:

  1. Cover the core aspects. Six pages of concrete questions, answers and suggestions are designed to help ensure that start-up activities are well informed.
  2. Keep it as simple and light as possible. We wanted something that an individual could quickly consume, yet find a valuable resource.
  3. Feed into larger projects. By creating something concrete, we hope that it would feed into larger initiatives like Heather Leason and Willow Brugh’s effort to build out a Digital Responders Handbook.

So, are you a passionate individual who wants to help harness local digitally-enabled volunteers or groups in response to emergencies? Would you like to become a central figure and coordinate these groups so that any response is more than the sum of all its parts? If this describes your desire and you answered the questions positively, then this guidance is for you! Create a local Digital Response Network. And, welcome to the world of digital humanitarian response…(More)”

How Google Optimized Healthy Office Snacks


Zoe Chance, Ravi Dhar, Michelle Hatzis and Michiel Bakker at Harvard Business Review: “Employers need simple, low-cost ways of helping employees make healthy choices. The effects of poor health and obesity cost U.S. companies $225 billion every year, according to the Centers for Disease Control, and this number is quickly rising. Although some employer-sponsored wellness programs have yielded high returns — Johnson & Johnson reported a 170% return on wellness spending in the 2000s — the employee wellness industry as a whole has struggled to prove its value.

Wellness initiatives often fail because they rely on outdated methods of engagement, placing too much emphasis on providing information. Extensive evidence from behavioral economics has shown that information rarely succeeds in changing behavior or building new habits for fitness and food choices. Telling people why and how to improve their health fails to elicit behavior changes because behavior often diverges from intentions. This is particularly true for food choices because our self-control is taxed by any type of depletion, including hunger. And the necessity of making food decisions many times a day means we can’t devote much processing power to each choice, so our eating behaviors tend to be habit- and instinct-driven. With a clearer understanding of the influences on choice — context and impulsivity, for instance — companies can design environments that reinforce employees’ healthy choices, limit potential lapses, and save on health care costs.

Jointly, the Google Food Team and the Yale Center for Customer Insights have been studying how behavioral economics can improve employee health choices. We’ve run multiple field experiments to understand how small “tweaks” can nudge behavior toward desirable outcomes and yield outsized benefits. To guide these interventions, we distilled scattered findings from behavioral science into a simple framework, the four Ps of behavior change:

  • Process
  • Persuasion
  • Possibilities
  • Person

The framework helped us structure a portfolio of strategies for making healthy choices easier and more enticing and making unhealthy choices harder and less tempting. Below, we present a brief example of each point of intervention….(More)”

Research and Evaluation of Participatory Budgeting in the U.S. and Canada


Public Agenda: “Communities across the country are experimenting with participatory budgeting (PB), a democratic process in which residents decide together how to spend part of a public budget. Learning more about how these community efforts are implemented and with what results will help improve and expand successful forms of participatory budgeting across the U.S. and Canada.

Public Agenda is supporting local evaluation efforts and sharing research on participatory budgeting. Specifically, we are:

  • Building a community of practice among PB evaluators and researchers.
  • Working with evaluators and researchers to make data and research findings comparable across communities that use participatory budgeting.
  • Developing key metrics and research tools to help evaluate participatory budgeting (download these documents here).
  • Publishing a “Year in Participatory Budgeting Research” review based on data, findings, experiences and challenges from sites in the U.S. and Canada.
  • Conducting original, independent research on elected officials’ views of and experiences with participatory budgeting.
  • Convening the North American Participatory Budgeting Research Board.

…Below, you will find evaluation tools and resources we developed in close collaboration with PB evaluators and researchers in the U.S. and Canada. We also included the local evaluation reports from communities around the U.S. and Canada using PB in budget decisions.

To be the first to hear about new PB resources and news, join our email list. We also invite you to email us to join our listserv and participate in discussion about evaluation and research of participatory budgeting in the U.S. and Canada.

New to PB and looking to introduce it to your community? You should start here instead! Once your PB effort is under way, come back to this page for tools to evaluate how you’re doing.

15 Key Metrics for Evaluating Participatory Budgeting: A Toolkit for Evaluators and Implementers

Evaluation is a critical component of any PB effort. Systematic and formal evaluation can help people who introduce, implement, participate in or otherwise have a stake in PB understand how participatory budgeting is growing, what its reach is, and how it’s impacting the community and beyond.

We developed the 15 Key Metrics for Evaluating Participatory Budgeting toolkit for people interested in evaluating PB efforts in their communities. It is meant to encourage and support some common research goals across PB sites and meaningfully inform local and national discussions about PB in the U.S. and Canada. It is the first iteration of such a toolkit and especially focused on providing practical and realistic guidance for the evaluation of new and relatively new PB processes.

Anyone involved in public engagement or participation efforts other than participatory budgeting may also be interested in reviewing the toolkit for research and evaluation ideas.

The toolkit requires registration before you can download.

The toolkit includes the following sections:

15 Key Metrics for Evaluating Participatory Budgeting: 15 indicators (“metrics”) that capture important elements of each community-based PB process and the PB movement in North America overall. Click here for a brief description of these metrics….(More)”

Design for policy and public services


The Centre for Public Impact: “Traditional approaches to policymaking have left policymakers and citizens looking for alternative solutions. Despite the best of intentions, the standard model of dispassionate expert analysis and subsequent implementation by a professional bureaucracy has, generally, led to siloed solutions and outcomes for citizens that fall short of what might be possible.

The discipline of design may well provide an answer to this problem by offering a collection of methods which allow civil servants to generate insights based on citizens’ needs, aspirations and behaviours. In doing so, it changes the view of citizens from seeing them as anonymous entities to complex humans with complex needs to match. The potential of this new approach is already becoming clear – just ask the medical teams and patients at Norway’s Oslo University Hospital. Women with a heightened risk of developing breast cancer had previously been forced to wait up to three months before receiving an appointment for examination and diagnosis. A redesign reduced this wait to just three days.

In-depth user research identified the principal issues and pinpointed the lack of information about the referral process as a critical problem. The designers also interviewed 40 hospital employees at all levels to find out about their daily schedules and processes. Governments have always drawn inspiration from fields such as sociology and economics. Design methods are not (yet) part of the policymaking canon, but such examples help explain why this may be about to change….(More)”

Value public information so we can trust it, rely on it and use it


Speech by David Fricker, the director general of the National Archives of Australia: “No-one can deny that we are in an age of information abundance. More and more we rely on information from a variety of sources and channels. Digital information is seductive, because it’s immediate, available and easy to move around. But digital information can be used for nefarious purposes. Social issues can be at odds with processes of government in this digital age. There is a tension between what is the information, where it comes from and how it’s going to be used.

How do we know if the information has reached us without being changed, whether that’s intentional or not?

How do we know that government digital information will be the authoritative source when the pace of information exchange is so rapid? In short, how do we know what to trust?

Consider the challenges and risks that come with the digital age: what does it really mean to have transparency and integrity of government in today’s digital environment?…

What does the digital age mean for government? Government should be delivering services online, which means thinking about location, timeliness and information accessibility. It’s about getting public-sector data out there, into the public, making it available to fuel the digital economy. And it’s about a process of change across government to make sure that we’re breaking down all of those silos, and the duplication and fragmentation which exist across government agencies in the application of information, communications, and technology…..

The digital age is about the digital economy; it’s about rethinking the economy of the nation through the lens of the information that enables it. It’s understanding that a nation will be enriched, in terms of cultural life, prosperity and rights, if we embrace the digital economy. And that’s a weighty responsibility. But the responsibility is not mine alone. It’s a responsibility of everyone in government who makes records in their daily work. It’s everyone’s responsibility to contribute to a transparent government. And that means changes in our thinking and in our actions….

What has changed about democracy in the digital age? Once upon a time if you wanted to express your anger about something, you might write a letter to the editor of the paper, to the government department, or to your local member and then expect some sort of an argument or discussion as a response. Now, you can bypass all of that. You might post an inflammatory tweet or blog, your comment gathers momentum, you pick the right hashtag, and off we go. It’s all happening: you’re trending on Twitter…..

If I turn to transparency now, at the top of the list is the basic recognition that government information is public information. The information of the government belongs to the people who elected that government. It’s a fundamental of democratic values. It also means that there’s got to be more public participation in the development of public policy, which means if you’re going to have evidence-based, informed, policy development; government information has to be available, anywhere, anytime….

Good information governance is at the heart of managing digital information to provide access to that information into the future — ready access to government information is vital for transparency. Only when information is digital and managed well can government share it effectively with the Australian community, to the benefit of society and the economy.

There are many examples where poor information management, or poor information governance, has led to failures — both in the private and public sectors. Professor Peter Shergold’s recent report, Learning from Failure, why large government policy initiatives have gone so badly wrong in the past and how the chances of success in the future can be improved, highlights examples such as the Home Insulation Program, the NBN and Building the Education Revolution….(Full Speech)