Paper by Michael J. Burstein and Fiona Murray: “Innovation prizes in reality are significantly different from innovation prizes in theory. The former are familiar from popular accounts of historical prizes like the Longitude Prize: the government offers a set amount for a solution to a known problem, like £20,000 for a method of calculating longitude at sea. The latter are modeled as compensation to inventors in return for donating their inventions to the public domain. Neither the economic literature nor the policy literature that led to the 2010 America COMPETES Reauthorization Act — which made prizes a prominent tool of government innovation policy — provides a satisfying justification for the use of prizes, nor does either literature address their operation. In this article, we address both of these problems. We use a case study of one canonical, high-profile innovation prize — the Progressive Insurance Automotive X Prize — to explain how prizes function as institutional means to achieve exogenously defined innovation policy goals in the face of significant uncertainty and information asymmetries. Focusing on the structure and function of actual innovation prizes as an empirical matter enables us to make three theoretical contributions to the current understanding of prizes. First, we offer a stronger normative justification for prizes grounded in their status as a key institutional arrangement for solving a specified innovation problem. Second, we develop a model of innovation prize governance and then situate that model in the administrative state, as a species of “new governance” or “experimental” regulation. Third, we derive from those analyses a novel framework for choosing among prizes, patents, and grants, one in which the ultimate choice depends on a trade-off between the efficacy and scalability of the institutional solution….(More)”
Big data, meet behavioral science
Brookings: “America’s community colleges offer the promise of a more affordable pathway to a bachelor’s degree. Students can pay substantially less for the first two years of college, transfer to a four-year college or university, and still earn their diploma in the same amount of time. At least in theory. Most community college students—80 percent of them—enter with the intention to transfer, but only 20 percent actually do so within five years of entering college. This divide represents a classic case of what behavioralists call an intention-action gap.
Why would so many students who enter community colleges intending to transfer fail to actually do so? Put yourself in the shoes of a 20-something community college student. You’ve worked hard for the past couple years, earning credits and paying a lot less in tuition than you would have if you had enrolled immediately in a four-year college or university. But now you want to transfer, so that you can complete your bachelor’s degree. How do you figure out where to go? Ideally you’d probably like to find a college that would take most of your credits, where you’re likely to graduate from, and where the degree is going to count for something in the labor market. A college advisor could probably help you figure this out, but at many community colleges there are at least 1,000 other students assigned to your advisor, so you might have a hard time getting a quality meeting. Some states have articulation agreements between two- and four-year institutions that guarantee admission for students who complete certain course sequences and perform at a high enough level. But these agreements are often dense and inaccessible.
The combination of big data and behavioral insights has the potential to help students navigate these complex decisions and successfully follow through on their intentions. Big data analytic techniques allow us to identify concrete transfer pathways where students are positioned to succeed; behavioral insights ensure we communicate these options in a way that maximizes students’ engagement and responsiveness…. A growing body of innovative research has demonstrated that, by applying behavioral science insights to the way we communicate with students and families about the opportunities and resources available to them, we can help people navigate these complex decisions and experience better outcomes as a result. A combination of simplified information, reminders, and access to assistance has improved achievement and attainment up and down the education pipeline, nudging parents to practice early-literacy activities with their kids or check in with their high schoolers about missed assignments, and encouraging students to renew their financial aid for college….
These types of big data techniques are already being used in some education sectors. For instance, a growing number of colleges use predictive analytics to identify struggling students who need additional assistance, so faculty and administrators can intervene before the student drops out. But frequently, once the results of these predictive analyses are in hand, there is insufficient attention to how to communicate the information in a way that is likely to lead to behavior change among students or educators. And much of the predictive analytics work has been on the side of plugging leaks in the pipeline (e.g. preventing drop-outs from higher education), rather than on the side of proactively sending students and families personalized information about educational and career pathways where they are likely to flourish…(More)”
Accelerating Discovery with New Tools and Methods for Next Generation Social Science
DARPA: “The explosive growth of global digital connectivity has opened new possibilities for designing and conducting social science research. Once limited by practical constraints to experiments involving just a few dozen participants—often university students or other easily available groups—or to correlational studies of large datasets without any opportunity for determining causation, scientists can now engage thousands of diverse volunteers online and explore an expanded range of important topics and questions. If new tools and methods for harnessing virtual or alternate reality and massively distributed platforms could be developed and objectively validated, many of today’s most important and vexing challenges in social science—such as identifying the primary drivers of social cooperation, instability and resilience—might be made more tractable, with benefits for domains as broad as national security, public health, and economics.
To begin to assess the research opportunities provided by today’s web-connected world and advanced technologies, DARPA today launched its Next Generation Social Science (NGS2) program. The program aims to build and evaluate new methods and tools to advance rigorous, reproducible social science studies at scales necessary to develop and validate causal models of human social behaviors. The program will draw upon and build across a wide array of disciplines—including social sciences like sociology, economics, political science, anthropology, and psychology, as well as information and computer sciences, physics, biology and math.
As an initial focus, NGS2 will challenge researchers to develop and use these new tools and methods to identify causal mechanisms of “collective identity” formation—how a group of individuals becomes a unified whole, and how under certain circumstances that community breaks down into a chaotic mix of disconnected individuals.
“Social science has done a remarkable job of helping us understand ourselves as the highly social creatures we are, but the field has long acknowledged and rued some frustrating research limitations, including technical and logistical limits to experimentally studying large, representative populations and the challenges of replicating key studies to better understand the limits of our knowledge,” said Adam Russell, DARPA program manager. “As a result, it’s been difficult for social scientists to determine what variables matter most in explaining their observations of human social systems and to move from documenting correlation to identifying causation.”
On top of those methodological and analytic limitations, Russell said, the field is inherently challenged because of its subject matter: human beings, with all their complex variability and seeming unpredictability. “Physicists have joked about how much more difficult their field would be if atoms or electrons had personalities, but that’s exactly the situation faced by social scientists,” he said.
By developing and applying new methods and models to larger, more diverse, and more representative groups of individuals—such as through web-based global gaming and alternate reality platforms—NGS2 seeks to validate new tools that may empower social science in the same way that sophisticated telescopes and microscopes have helped advance astronomy and biology….(More)”
Revolutionizing Innovation: Users, Communities, and Open Innovation
Book edited by Dietmar Harhoff and Karim R. Lakhani: “The last two decades have witnessed an extraordinary growth of new models of managing and organizing the innovation process that emphasize users over producers. Large parts of the knowledge economy now routinely rely on users, communities, and open innovation approaches to solve important technological and organizational problems. This view of innovation, pioneered by the economist Eric von Hippel, counters the dominant paradigm, which casts the profit-seeking incentives of firms as the main driver of technical change. In a series of influential writings, von Hippel and colleagues found empirical evidence that flatly contradicted the producer-centered model of innovation. Since then, the study of user-driven innovation has continued and expanded, with further empirical exploration of a distributed model of innovation that includes communities and platforms in a variety of contexts and with the development of theory to explain the economic underpinnings of this still emerging paradigm. This volume provides a comprehensive and multidisciplinary view of the field of user and open innovation, reflecting advances in the field over the last several decades.
The contributors—including many colleagues of Eric von Hippel—offer both theoretical and empirical perspectives from such diverse fields as economics, the history of science and technology, law, management, and policy. The empirical contexts for their studies range from household goods to financial services. After discussing the fundamentals of user innovation, the contributors cover communities and innovation; legal aspects of user and community innovation; new roles for user innovators; user interactions with firms; and user innovation in practice, describing experiments, toolkits, and crowdsourcing, and crowdfunding…(More)”
How Google Optimized Healthy Office Snacks
Zoe Chance, Ravi Dhar, Michelle Hatzis and Michiel Bakker at Harvard Business Review: “Employers need simple, low-cost ways of helping employees make healthy choices. The effects of poor health and obesity cost U.S. companies $225 billion every year, according to the Centers for Disease Control, and this number is quickly rising. Although some employer-sponsored wellness programs have yielded high returns — Johnson & Johnson reported a 170% return on wellness spending in the 2000s — the employee wellness industry as a whole has struggled to prove its value.
Wellness initiatives often fail because they rely on outdated methods of engagement, placing too much emphasis on providing information. Extensive evidence from behavioral economics has shown that information rarely succeeds in changing behavior or building new habits for fitness and food choices. Telling people why and how to improve their health fails to elicit behavior changes because behavior often diverges from intentions. This is particularly true for food choices because our self-control is taxed by any type of depletion, including hunger. And the necessity of making food decisions many times a day means we can’t devote much processing power to each choice, so our eating behaviors tend to be habit- and instinct-driven. With a clearer understanding of the influences on choice — context and impulsivity, for instance — companies can design environments that reinforce employees’ healthy choices, limit potential lapses, and save on health care costs.
Jointly, the Google Food Team and the Yale Center for Customer Insights have been studying how behavioral economics can improve employee health choices. We’ve run multiple field experiments to understand how small “tweaks” can nudge behavior toward desirable outcomes and yield outsized benefits. To guide these interventions, we distilled scattered findings from behavioral science into a simple framework, the four Ps of behavior change:
- Process
- Persuasion
- Possibilities
- Person
The framework helped us structure a portfolio of strategies for making healthy choices easier and more enticing and making unhealthy choices harder and less tempting. Below, we present a brief example of each point of intervention….(More)”
Design for policy and public services
The Centre for Public Impact: “Traditional approaches to policymaking have left policymakers and citizens looking for alternative solutions. Despite the best of intentions, the standard model of dispassionate expert analysis and subsequent implementation by a professional bureaucracy has, generally, led to siloed solutions and outcomes for citizens that fall short of what might be possible.
The discipline of design may well provide an answer to this problem by offering a collection of methods which allow civil servants to generate insights based on citizens’ needs, aspirations and behaviours. In doing so, it changes the view of citizens from anonymous entities to complex humans with complex needs to match. The potential of this new approach is already becoming clear – just ask the medical teams and patients at Norway’s Oslo University Hospital. Women with a heightened risk of developing breast cancer had previously been forced to wait up to three months before receiving an appointment for examination and diagnosis. A redesign reduced this wait to just three days.
In-depth user research identified the principal issues and pinpointed the lack of information about the referral process as a critical problem. The designers also interviewed 40 hospital employees of all levels to find out about their daily schedules and processes. Governments have always drawn inspiration from fields such as sociology and economics. Design methods are not (yet) part of the policymaking canon, but such examples help explain why this may be about to change….(More)”
Private Data and Public Value: Governance, Green Consumption, and Sustainable Supply Chains
Book edited by Holly Jarman and Luis F. Luna-Reyes: “This book investigates the ways in which data-sharing systems can promote public value by encouraging the disclosure and reuse of privately-held data in ways that support collective values such as environmental sustainability. Supported by funding from the National Science Foundation, the authors’ research team has been working on one such system, designed to enhance consumers’ ability to access information about the sustainability of the products that they buy and the supply chains that produce them. Pulled by rapidly developing technology and pushed by budget cuts, politicians and public managers are attempting to find ways to increase the public value of their actions. Policymakers are increasingly acknowledging the potential that lies in publicly disclosing more of the data that they hold, as well as incentivizing individuals and organizations to access, use, and combine it in new ways. Due to technological advances which include smarter phones, better ways to track objects and people as they travel, and more efficient data processing, it is now possible to build systems which use shared, transparent data in creative ways. The book adds to the current conversation among academics and practitioners about how to promote public value through data disclosure, focusing particularly on the roles that governments, businesses and non-profit actors can play in this process, making it of interest to both scholars and policy-makers….(More)”
Citizen Science and the Flint Water Crisis
The Wilson Center’s Commons Lab: “In April 2014, the city of Flint, Michigan decided to switch its water supply source from the Detroit water system to a cheaper alternative, the Flint River. But in exchange for the cheaper price tag, the Flint residents paid a greater price with one of the worst public health crises of the past decade.
Despite concerns from Flint citizens about the quality of the water, the Michigan Department of Environmental Quality repeatedly attributed the problem to the plumbing system. It was LeeAnne Walters, a 37-year-old mother of four, who, after noticing physical and behavioral changes in her children and herself, set off a chain of events that exposed the national scandal. Eventually, with the support of Dr. Marc Edwards, an environmental engineering professor at Virginia Tech (VT), Walters discovered lead concentration levels of 13,200 parts per billion in her water, 880 times the maximum concentration allowed by law and more than twice the level the Environmental Protection Agency considers to be hazardous waste.
Citizen science emerged as an important piece of combating the Flint water crisis. Alarmed by the government’s neglect and the health issues spreading all across Flint, Edwards and Walters began the Flint Water Study, a collaboration between the Flint residents and research team from VT. Using citizen science, the VT researchers provided the Flint residents with kits to sample and test their homes’ drinking water and then analyzed the results to unearth the truth behind Flint’s water quality.
The citizen-driven project illustrates the capacity for nonprofessional scientists to use science in order to address problems that directly affect themselves and their community. While the VT team needed the Flint residents to provide water samples, the Flint residents in turn needed the VT team to conduct the analysis. In short, both parties achieved mutually beneficial results and the partnership helped expose the scandal. Surprisingly, the “traditional” problems associated with citizen science, including the inability to mobilize the local constituent base and the lack of collaboration between citizens and professional scientists, were not the obstacles in Flint….(More)”
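The figures cited in the excerpt above can be sanity-checked directly. A minimal sketch, assuming the EPA Lead and Copper Rule action level of 15 ppb and the hazardous-waste threshold for lead of 5 mg/L (5,000 ppb), reference values which the excerpt implies but does not state:

```python
# Lead concentration measured in LeeAnne Walters's water, in parts per billion.
sample_ppb = 13_200

# Assumed regulatory reference values (not stated in the excerpt):
action_level_ppb = 15        # EPA Lead and Copper Rule action level
hazardous_waste_ppb = 5_000  # EPA hazardous-waste threshold for lead (5 mg/L)

# "880 times the maximum concentration allowed by law"
print(sample_ppb / action_level_ppb)    # 880.0

# "more than twice the level ... considered to be hazardous waste"
print(sample_ppb / hazardous_waste_ppb) # 2.64
```

Both ratios are consistent with the figures reported in the excerpt.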
Ebola: A Big Data Disaster
Study by Sean Martin McDonald: “…undertaken with support from the Open Society Foundation, Ford Foundation, and Media Democracy Fund, explores the use of Big Data in the form of Call Detail Record (CDR) data in humanitarian crises.
It discusses the challenges of digital humanitarian coordination in health emergencies like the Ebola outbreak in West Africa, and the marked tension in the debate around experimentation with humanitarian technologies and the impact on privacy. McDonald’s research focuses on the two primary legal and human rights frameworks, privacy and property, to question the impact of unregulated use of CDRs on human rights. It also highlights how the diffusion of data science to the realm of international development constitutes a genuine opportunity to bring powerful new tools to fight crisis and emergencies.
Analysing the risks of using CDRs to perform migration analysis and contact tracing without user consent, as well as the application of big data to disease surveillance, is an important entry point into the debate around use of Big Data for development and humanitarian aid. The paper also raises crucial questions of legal significance about the access to information, the limitation of data sharing, and the concept of proportionality in privacy invasion in the public good. These issues hold great relevance today, as the emerging role of big data in development, including its actual and potential uses as well as its harms, is under consideration across the world.
The paper highlights the absence of a dialogue around the significant legal risks posed by the collection, use, and international transfer of personally identifiable data and humanitarian information, and the grey areas around assumptions of public good. The paper calls for a critical discussion around the experimental nature of data modelling in emergency response, where the mismanagement of information can undermine the very human rights protections it is meant to serve….
See Sean Martin McDonald – “Ebola: A Big Data Disaster” (PDF).
Data Collaboratives: Matching Demand with Supply of (Corporate) Data to solve Public Problems
Blog by Stefaan G. Verhulst, Iryna Susha and Alexander Kostura: “Data Collaboratives refer to a new form of collaboration, beyond the public-private partnership model, in which participants from different sectors (private companies, research institutions, and government agencies) share data to help solve public problems. Several of society’s greatest challenges — from climate change to poverty — require greater access to big (but not always open) data sets, more cross-sector collaboration, and increased capacity for data analysis. Participants at the workshop and breakout session explored the various ways in which data collaboratives can help meet these needs.
Matching supply and demand of data emerged as one of the most important and overarching issues facing the big and open data communities. Participants agreed that more experimentation is needed so that new, innovative and more successful models of data sharing can be identified.
How to discover and enable such models? When asked how the international community might foster greater experimentation, participants indicated the need to develop the following:
· A responsible data framework that serves to build trust in sharing data. Such a framework would be based upon existing frameworks but would also accommodate emerging technologies and practices. It would also need to be sensitive to public opinion and perception.
· Increased insight into different business models that may facilitate the sharing of data. As experimentation continues, the data community should map emerging practices and models of sharing so that successful cases can be replicated.
· Capacity to tap into the potential value of data. On the demand side, capacity refers to the ability to pose good questions, understand current data limitations, and seek new data sets responsibly. On the supply side, this means seeking shared value in collaboration, thinking creatively about public use of private data, and establishing norms of responsibility around security, privacy, and anonymity.
· Transparent stock of available data supply, including an inventory of what corporate data exist that can match multiple demands and that is shared through established networks and new collaborative institutional structures.
· Mapping emerging practices and models of sharing. Corporate data offers value not only for humanitarian action (which was a particular focus at the conference) but also for a variety of other domains, including science, agriculture, health care, urban development, environment, media and arts, and others. Gaining insight into the practices that emerge across sectors could broaden the spectrum of what is feasible and how.
In general, it was felt that understanding the business models underlying data collaboratives is of utmost importance in order to achieve win-win outcomes for both private and public sector players. Moreover, issues of public perception and trust were raised as important concerns of government organizations participating in data collaboratives….(More)”