Index: Prizes and Challenges


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on prizes and challenges and was originally published in 2015.

This index highlights recent findings about two key techniques in shifting innovation from institutions to the general public:

  • Prize-Induced Contests – using monetary rewards to incentivize individuals and other entities to develop solutions to public problems; and
  • Grand Challenges – posing large, audacious goals to the public to spur collaborative, non-governmental efforts to solve them.

You can read more about Governing through Prizes and Challenges here. You can also watch Alph Bingham, co-founder of InnoCentive, answer the GovLab’s questions about challenge authoring and defining the problem here.

Previous installments of the Index include Measuring Impact with Evidence, The Data Universe, Participation and Civic Engagement and Trust in Institutions. Please share any additional statistics and research findings on the intersection of technology in governance with us by emailing shruti at thegovlab.org.

Prize-Induced Contests

  • Year the British Government introduced the Longitude Prize, one of the first instances of a government using a prize to spur innovation: 1714
  • President Obama calls on “all agencies to increase their use of prizes to address some of our Nation’s most pressing challenges” in his Strategy for American Innovation: September 2009
  • The US Office of Management and Budget issues “a policy framework to guide agencies in using prizes to mobilize American ingenuity and advance their respective core missions”: March 2010
  • Launch of Challenge.gov, “a one-stop shop where entrepreneurs and citizen solvers can find public-sector prize competitions”: September 2010
    • Number of competitions live on Challenge.gov as of February 2015: 22 (of 399 total)
    • Number of competitions on Challenge.gov offering $1 million or more: 23
  • The America COMPETES Reauthorization Act is introduced, which grants “all Federal agencies authority to conduct prize competitions to spur innovation, solve tough problems, and advance their core missions”: 2010
  • Value of prizes authorized by COMPETES: up to $50 million
  • Fact Sheet and Frequently Asked Questions memorandum issued by the Office of Science and Technology Policy and the Office of Management and Budget to help agencies take advantage of the authorities in COMPETES: August 2011
  • Number of prize competitions run by the Federal government from 2010 to April 2012: 150
  • Number of Federal agencies that had run prize competitions by 2012: 40
  • Prior to 1991, the percentage of prize money that recognized prior achievements according to an analysis by McKinsey and Company: 97%
    • Since 1991, percentage of new prize money that “has been dedicated to inducement-style prizes that focus on achieving a specific, future goal”: 78%
  • Value of the prize sector as estimated by McKinsey in 2009: $1-2 billion
  • Growth rate of the total value of new prizes: 18% annually
  • Growth rate in charitable giving in the US: 2.5% annually
  • Value of the first Horizon Prize awarded in 2014 by the European Commission to German biopharmaceutical company CureVac GmbH “for progress towards a novel technology to bring life-saving vaccines to people across the planet in safe and affordable ways”: €2 million
  • Number of solvers registered on InnoCentive, a crowdsourcing company: 355,000+ from nearly 200 countries
    • Total challenges posted: 2,000+ external challenges
    • Total solution submissions: 40,000+
    • Value of awards: $5,000 to more than $1 million
    • Success rate for premium challenges: 85%

Grand Challenges

  • Value of the Progressive Insurance Automotive X Prize, sponsored in part by DOE to develop production-capable super fuel-efficient vehicles: $10 million
    • Number of teams around the world who took part in the challenge “to develop a new generation of technologies” for production-capable super fuel-efficient vehicles: 111 teams
  • Time it took for the Air Force Research Laboratory to receive a workable solution on “a problem that had vexed military security forces and civilian police for years” by opening the challenge to the world: 60 days
  • Value of the HHS Investing in Innovation initiative to spur innovation in Health IT, launched under the new COMPETES Act: $5 million
  • Number of responses received by NASA for its Asteroid Grand Challenge RFI, which seeks to identify and address all asteroid threats to the human population: over 400
  • Decrease in the cost of sequencing a single human genome as a result of the Human Genome Project Grand Challenge: from $100 million to $7,000
  • Amount the Human Genome Project Grand Challenge has contributed to the US economy for every $1 invested by the US federal government: $141
  • Amount of research funding available for the “BRAIN Initiative,” a collaboration between the National Institutes of Health, DARPA and the National Science Foundation, which seeks to uncover new prevention and treatment methods for brain disorders like Alzheimer’s, autism and schizophrenia: $100 million
  • Total amount offered in cash awards by the Department of Energy’s “SunShot Grand Challenge,” which seeks to eliminate the cost disparity between solar energy and coal by the end of the decade: $10 million

Encyclopedia of Social Network Analysis and Mining


“The Encyclopedia of Social Network Analysis and Mining (ESNAM) is the first major reference work to integrate fundamental concepts and research directions in the areas of social networks and applications to data mining. While ESNAM reflects the state-of-the-art in social network research, the field had its start in the 1930s when fundamental issues in social network research were broadly defined. These communities were limited to relatively small numbers of nodes (actors) and links. More recently the advent of electronic communication, and in particular on-line communities, has created social networks of hitherto unimaginable sizes. People around the world are directly or indirectly connected by popular social networks established using web-based platforms rather than by physical proximity.

Reflecting the interdisciplinary nature of this unique field, the essential contributions of diverse disciplines, from computer science, mathematics, and statistics to sociology and behavioral science, are described among the 300 authoritative yet highly readable entries. Students will find a world of information and insight behind the familiar façade of the social networks in which they participate. Researchers and practitioners will benefit from a comprehensive perspective on the methodologies for analysis of constructed networks, and the data mining and machine learning techniques that have proved attractive for sophisticated knowledge discovery in complex applications. Also addressed is the application of social network methodologies to other domains, such as web networks and biological networks….(More)”

Evaluating Complex Social Initiatives


Srik Gopal at Stanford Social Innovation Review: “…the California Endowment (TCE)…and…the Grand Rapids Community Foundation (GRCF)…are just two funders who are actively “shifting the lens” in terms of how they are looking at evaluation in the light of complexity. They are building on the recognition that a traditional approach to evaluation—assessing specific effects of a defined program according to a set of pre-determined outcomes, often in a way that connects those outcomes back to the initiative—is increasingly falling short. There is a clear message emerging that evaluation needs to accommodate complexity, not “assume it away.”

My colleagues at FSG and I have, based on our work with TCE, GRCF, and numerous other clients, articulated a set of nine “propositions” in a recent practice brief that are helpful in guiding how we conceptualize, design, and implement evaluations of complex initiatives. We derived these propositions from what we now know (based on the emerging field of complexity science) as distinctive characteristics of complex systems. We know, for example, that complex systems are always changing, often in unpredictable ways; they are never static. Hence, we need to design evaluations so that they are adaptive, flexible, and iterative, not rigid and cookie-cutter.

Below are three of the propositions in more detail, along with tools and methods that can help apply each proposition in practice.

It is important to note that many of the traditional tools and methods that form the backbone of sound evaluations—such as interviews, focus groups, and surveys—are still relevant. We would, however, suggest that organizations adapt those methods to reflect a complexity orientation. For example, interviews should explore the role of context; we should not confine them to the initiative’s boundaries. Focus groups should seek to understand local adaptation, not just adherence. And surveys should probe for relationships and interdependencies, not just perceived outcomes. In addition to traditional methods, we suggest incorporating newer, innovative techniques that provide a richer narrative, including:

  • Systems mapping—an iterative, often participatory process of graphically representing a system, including its components and connections (see the sketch after this list for a minimal graph-based example)
  • Appreciative inquiry—a group process that inquires into, identifies, and further develops the best of “what is” in organizations
  • Design thinking—a user-centered approach to developing new solutions to abstract, ill-defined, or complex problems… (More)”
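
To make the systems-mapping idea concrete, here is a minimal sketch of a systems map represented as a directed graph in Python with the networkx library. The components, connections and influence signs are entirely hypothetical and are not drawn from FSG’s practice brief; in practice, systems mapping is a participatory, iterative exercise rather than something an analyst hard-codes.

```python
# A hypothetical systems map for an imaginary community-health initiative,
# represented as a directed graph. Node names and influence signs are
# illustrative only.
import networkx as nx

system_map = nx.DiGraph()

# Each edge is a connection between components; the "influence" attribute
# records whether the relationship is reinforcing (+) or dampening (-).
system_map.add_edge("Funding", "Clinic capacity", influence="+")
system_map.add_edge("Clinic capacity", "Wait times", influence="-")
system_map.add_edge("Wait times", "Community trust", influence="-")
system_map.add_edge("Community trust", "Programme uptake", influence="+")
system_map.add_edge("Programme uptake", "Health outcomes", influence="+")
system_map.add_edge("Health outcomes", "Funding", influence="+")  # feedback loop

# Print the map so participants can review and revise the connections.
for source, target, attrs in system_map.edges(data=True):
    print(f"{source} --({attrs['influence']})--> {target}")
```

Even a toy map like this makes feedback loops visible (here, health outcomes feeding back into funding), which is exactly the kind of system-level dynamic that an evaluation confined to pre-determined outcomes tends to miss.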

If Data Sharing is the Answer, What is the Question?


Christine L. Borgman at ERCIM News: “Data sharing has become policy enforced by governments, funding agencies, journals, and other stakeholders. Arguments in favor include leveraging investments in research, reducing the need to collect new data, addressing new research questions by reusing or combining extant data, and reproducing research, which would lead to greater accountability, transparency, and less fraud. Arguments against data sharing rarely are expressed in public fora, so popular is the idea. Much of the scholarship on data practices attempts to understand the socio-technical barriers to sharing, with goals to design infrastructures, policies, and cultural interventions that will overcome these barriers.
However, data sharing and reuse are common practice in only a few fields. Astronomy and genomics in the sciences, survey research in the social sciences, and archaeology in the humanities are the typical exemplars, which remain the exceptions rather than the rule. The lack of success of data sharing policies, despite accelerating enforcement over the last decade, indicates the need not just for a much deeper understanding of the roles of data in contemporary science but also for developing new models of scientific practice. Science progressed for centuries without data sharing policies. Why is data sharing deemed so important to scientific progress now? How might scientific practice be different if these policies were in place several generations ago?
Enthusiasm for “big data” and for data sharing are obscuring the complexity of data in scholarship and the challenges for stewardship. Data practices are local, varying from field to field, individual to individual, and country to country. Studying data is a means to observe how rapidly the landscape of scholarly work in the sciences, social sciences, and the humanities is changing. Inside the black box of data is a plethora of research, technology, and policy issues. Data are best understood as representations of observations, objects, or other entities used as evidence of phenomena for the purposes of research or scholarship. Rarely do they stand alone, separable from software, protocols, lab and field conditions, and other context. The lack of agreement on what constitutes data underlies the difficulties in sharing, releasing, or reusing research data.
Concerns for data sharing and open access raise broader questions about what data to keep, what to share, when, how, and with whom. Open data is sometimes viewed simply as releasing data without payment of fees. In research contexts, open data may pose complex issues of licensing, ownership, responsibility, standards, interoperability, and legal harmonization. To scholars, data can be assets, liabilities, or both. Data have utilitarian value as evidence, but they also serve social and symbolic purposes for control, barter, credit, and prestige. Incentives for scientific advancement often run counter to those for sharing data.
….
Rather than assume that data sharing is almost always a “good thing” and that doing so will promote the progress of science, more critical questions should be asked: What are the data? What is the utility of sharing or releasing data, and to whom? Who invests the resources in releasing those data and in making them useful to others? When, how, why, and how often are those data reused? Who benefits from what kinds of data transfer, when, and how? What resources must potential re-users invest in discovering, interpreting, processing, and analyzing data to make them reusable? Which data are most important to release, when, by what criteria, to whom, and why? What investments must be made in knowledge infrastructures, including people, institutions, technologies, and repositories, to sustain access to data that are released? Who will make those investments, and for whose benefit?
Only when these questions are addressed by scientists, scholars, data professionals, librarians, archivists, funding agencies, repositories, publishers, policy makers, and other stakeholders in research will satisfactory answers arise to the problems of data sharing…(More)”.

US government and private sector developing ‘precrime’ system to anticipate cyber-attacks


Martin Anderson at The Stack: “The USA’s Office of the Director of National Intelligence (ODNI) is soliciting the involvement of the private and academic sectors in developing a new ‘precrime’ computer system capable of predicting cyber-incursions before they happen, based on the processing of ‘massive data streams from diverse data sets’ – including social media and possibly deanonymised Bitcoin transactions….
At its core, the predictive technologies to be developed in association with the private sector and academia over 3-5 years are charged with the mission ‘to invest in high-risk/high-payoff research that has the potential to provide the U.S. with an overwhelming intelligence advantage over our future adversaries’.
The R&D program is intended to generate completely automated, human-free prediction systems for four categories of event: unauthorised access, Denial of Service (DoS), malicious code, and scans and probes seeking access to systems.
The CAUSE project is an unclassified program, and participating companies and organisations will not be granted access to NSA intercepts. The scope of the project, in any case, seems focused on the analysis of publicly available Big Data, including web searches, social media exchanges and trawling ungovernable avalanches of information in which clues to future maleficent actions are believed to be discernible.
Program manager Robert Rahmer says: “It is anticipated that teams will be multidisciplinary and might include computer scientists, data scientists, social and behavioral scientists, mathematicians, statisticians, content extraction experts, information theorists, and cyber-security subject matter experts having applied experience with cyber capabilities.”
Battelle, one of the concerns interested in participating in CAUSE, is interested in employing Hadoop and Apache Spark as an approach to the data mountain, and includes in its preliminary proposal an intent to ‘de-anonymize Bitcoin sale/purchase activity to capture communication exchanges more accurately within threat-actor forums…’.
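The article does not describe Battelle’s pipeline in technical detail, but the general pattern it gestures at – batch analysis of public forum data for early-warning signals, using Apache Spark – can be sketched in a few lines of PySpark. Everything below is an illustrative assumption: the input file, field names, indicator pattern and threshold are invented for the example and are not part of the CAUSE proposal.
```python
# A minimal, hypothetical sketch of early-warning signal extraction from
# public forum data with Apache Spark. The input schema, keyword pattern
# and threshold are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("early-warning-sketch").getOrCreate()

# Assume a JSON dump of public forum posts with "forum_id", "posted_at"
# and "text" fields.
posts = (spark.read.json("forum_posts.json")
         .withColumn("posted_at", F.to_timestamp("posted_at")))

# Flag posts that match a crude indicator pattern, then count flagged posts
# per forum per hour; spikes are surfaced as candidate warning signals.
signals = (posts
    .withColumn("hit", F.col("text").rlike(r"(?i)zero[- ]day|exploit kit|ddos").cast("int"))
    .groupBy(F.window("posted_at", "1 hour").alias("hour"), "forum_id")
    .agg(F.sum("hit").alias("threat_mentions"))
    .filter(F.col("threat_mentions") >= 5)
    .orderBy("hour"))

signals.show(truncate=False)
```
A production system of the kind IARPA describes would add far more – streaming ingestion, entity resolution and the probabilistic models that turn raw counts into forecasts – but the basic shape of the data problem is the same.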
Identifying and categorising quality signal in the ‘white noise’ of Big Data is a central plank in CAUSE, and IARPA maintains several offices to deal with different aspects of it. Its pointedly-named ‘Office for Anticipating Surprise’ frames the CAUSE project best, since it initiated it. The OAS is occupied with ‘Detecting and forecasting the emergence of new technical capabilities’, ‘Early warning of social and economic crises, disease outbreaks, insider threats, and cyber attacks’ and ‘Probabilistic forecasts of major geopolitical trends and rare events’.
Another concerned department is The Office of Incisive Analysis, which is attempting to break down the ‘data static’ problem into manageable mission stages:
1) Large data volumes and varieties – “Providing powerful new sources of information from massive, noisy data that currently overwhelm analysts”
2) Social-Cultural and Linguistic Factors – “Analyzing language and speech to produce insights into groups and organizations.”
3) Improving Analytic Processes – “Dramatic enhancements to the analytic process at the individual and group level.”
The Office of Smart Collection develops ‘new sensor and transmission technologies’, seeking ‘Innovative approaches to gain access to denied environments’ as part of its core mission, while the Office of Safe and Secure Operations concerns itself with ‘Revolutionary advances in science and engineering to solve problems intractable with today’s computers’.
The CAUSE program, which attracted 150 developers, organisations, academics and private companies to the initial event, will announce specific figures about funding later in the year, and practice ‘predictions’ from participants will begin in the summer, in an accelerating and stage-managed program over five years….(More)”

Open data could turn Europe’s digital desert into a digital rainforest


Joanna Roberts interviews Dirk Helbing, Professor of Computational Social Science at ETH Zurich, at Horizon: “…If we want to be competitive, Europe needs to find its own way. How can we differentiate ourselves and make things better? I believe Europe should not engage in the locked data strategy that we see in all these huge IT giants. Instead, Europe should engage in open data, open innovation, and value-sensitive design, particularly approaches that support informational self-determination. So everyone can use this data, generate new kinds of data, and build applications on top. This is going to create ever more possibilities for everyone else, so in a sense that will turn a digital desert into a digital rainforest full of opportunities for everyone, with a rich information ecosystem.’…
The Internet of Things is the next big emerging information communication technology. It’s based on sensors. In smartphones there are about 15 sensors; for light, for noise, for location, for all sorts of things. You could also buy additional external sensors for humidity, for chemical substances and almost anything that comes to your mind. So basically this allows us to measure the environment and all the features of our physical, biological, economic, social and technological environment.
‘Imagine if there was one company in the world controlling all the sensors and collecting all the information. I think that might potentially be a dystopian surveillance nightmare, because you couldn’t take a single step or speak a single word without it being recorded. Therefore, if we want the Internet of Things to be consistent with a stable democracy then I believe we need to run it as a citizen web, which means to create and manage the planetary nervous system together. The citizens themselves would buy the sensors and activate them or not, would decide themselves what sensor data they would share with whom and for what purpose, so informational self-determination would be at the heart, and everyone would be in control of their own data.’….
A lot of exciting things will become possible. We would have a real-time picture of the world and we could use this data to be more aware of what the implications of our decisions and actions are. We could avoid mistakes and discover opportunities we would otherwise have missed. We will also be able to measure what’s going on in our society and economy and why. In this way, we will eventually identify the hidden forces that determine the success or failure of a company, of our economy or even our society….(More)”

The crowd-sourcing web project bringing amateur and professional archaeologists together


Sarah Jackson at Culture 24: “With only limited funds and time, professional archaeologists consistently struggle to protect and interpret the UK’s vast array of archaeological finds and resources. Yet there are huge pools of amateur groups and volunteers who are not only passionate but also skilled and knowledgeable about archaeology in the UK.
Now a new web platform called MicroPasts has been produced in a collaboration between University College London (UCL) and the British Museum to connect institutions and volunteers so that they can create, fund and work on archaeological projects together.
Work by UCL postdoctoral researchers Chiara Bonacchi and Adi Keinan-Schoonbaert and British Museum postdoctoral researcher Jennifer Wexler established much of the groundwork, including the design, implementation and public engagement aspects of the new site.
According to one of the project leaders, Professor Andrew Bevan at UCL, MicroPasts emerged from a desire to harness the expertise (and manpower) of volunteers and to “pull together crowd-sourcing and crowd-funding in a way that hadn’t been tried before”.
Although there are many crowd-sourcing portals online, they are either specific to one project (DigVentures, for example, which conducted the world’s first crowd-funded dig in 2012) or can be used to create almost anything you can imagine (such as Kickstarter).
MicroPasts was also inspired by Crowdcrafting, which encourages citizen science projects and, like MicroPasts, offers a platform for volunteers and researchers with an interest in a particular subject to come together to create and contribute to projects….(More)”

Measuring government impact in a social media world


Arthur Mickoleit & Ryan Androsoff at OECD Insights: “There is hardly a government around the world that has not yet felt the impact of social media on how it communicates and engages with citizens. And while the most prominent early adopters in the public sector have tended to be politicians (think of US President Barack Obama’s impressive use of social media during his 2008 campaign), government offices are also increasingly jumping on the bandwagon. Yes, we are talking about those – mostly bricks-and-mortar – institutions that often toil away from the public gaze, managing the public administration in our countries. As the world changes, they too are increasingly engaging in a very public way through social media.
Research from our recent OECD working paper “Social Media Use by Governments” shows that as of November 2014, out of 34 OECD countries, 28 have a Twitter account for the office representing the top executive institution (head of state, head of government, or government as a whole), and 21 have a Facebook account….
But what is the impact governments can or should expect from social media? Is it all just vanity and peer pressure? Surely not.
Take the Spanish national police force (e.g. on Twitter, Facebook & YouTube), a great example of using social media to build long-term engagement, trust and a better public service. It is the thing so many governments yearn for, and in this case the Spanish police seem to have managed it well.
Or take the Danish “tax daddy” on Twitter – @Skattefar. It started out as the national tax administration’s quest to make it easier for everyone to submit correct tax filings; it is now one of the best examples around of a tax agency gone social.
Government administrations can use social media for internal purposes too. The Government of Canada used public platforms like Twitter and internal platforms like GCpedia and GCconnex to conduct a major employee engagement exercise (Blueprint 2020) to develop a vision for the future of the Canadian federal public service.
And when it comes to raising efficiency in the public sector, read this account of a Dutch research facility’s Director who decided to stop email. Not reduce it, but stop it altogether and replace it with social media.
There are so many other examples that could be cited. But the major question is how can we even begin to appraise the impact of these different initiatives? Because as we’ve known since the 19th century, “if you cannot measure it, you cannot improve it” (quote usually attributed to Lord Kelvin). Some aspects of impact measurement for social media can be borrowed from the private sector with regards to presence, popularity, penetration, and perception. But it’s around purpose that impact measurement agendas will split between the private sector and government. Virtually all companies will want to calculate the return on social media investments based on whether it helps them improve their financial returns. That’s different in the public sector where purpose is rarely defined in commercial terms.
A good impact assessment for social media in the public sector therefore needs to be built around its unique purpose-orientation. This is much more difficult to measure and it will involve a mix of quantitative data (e.g. reach of target audience) and qualitative data (e.g. case studies describing tangible impact). Social Media Use by Governments proposes a framework to start looking at social media measurement in gradual steps – from measuring presence, to popularity, to penetration, to perception, and finally, to purpose-orientation. The aim of this framework is to help governments develop truly relevant metrics and start treating social media activity by governments with the same public management rigour that is applied to other government activities.
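As a rough illustration of how the quantitative end of that ladder might be operationalised, the sketch below encodes a handful of presence, popularity, penetration and perception measures for a single government account. The class, field names and formulas are hypothetical simplifications, not the OECD’s own metric definitions, and the purpose-orientation step is deliberately absent because it rests on qualitative evidence such as case studies.
```python
# A hypothetical, simplified encoding of the presence -> popularity ->
# penetration -> perception steps; purpose-orientation is left out because
# it relies on qualitative evidence rather than a single number.
from dataclasses import dataclass

@dataclass
class GovAccountStats:
    agency: str
    is_active: bool          # presence: the account exists and posts regularly
    followers: int           # popularity: raw audience size
    target_population: int   # e.g. the citizens the agency serves
    positive_mentions: int   # perception inputs, e.g. from coded sentiment
    total_mentions: int

    def penetration(self) -> float:
        """Share of the target population reached (a crude proxy)."""
        return self.followers / self.target_population if self.target_population else 0.0

    def perception(self) -> float:
        """Share of coded mentions that are positive."""
        return self.positive_mentions / self.total_mentions if self.total_mentions else 0.0


# Illustrative numbers only.
stats = GovAccountStats(agency="National Tax Agency", is_active=True,
                        followers=250_000, target_population=4_000_000,
                        positive_mentions=1_800, total_mentions=2_400)
print(f"penetration: {stats.penetration():.1%}, perception: {stats.perception():.1%}")
```
Numbers like these only become meaningful alongside the qualitative, purpose-oriented evidence the working paper calls for, which is why the framework treats them as early rungs rather than the end goal.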
This is far from an exact science, but we are beginning the work collaborating with member and partner governments to develop a toolkit that will help decision-makers implement the OECD Recommendation on Digital Government Strategies, including on the issue of social media metrics…(More)”.

Governance in the Information Era


New book edited by Erik W. Johnston: “Policy informatics is addressing governance challenges and their consequences, which span the seeming inability of governments to solve complex problems and the disaffection of people from their governments. Policy informatics seeks approaches that enable our governance systems to address increasingly complex challenges and to meet the rising expectations of people to be full participants in their communities. This book approaches these challenges by applying a combination of the latest American and European approaches in complex systems modeling, crowdsourcing, participatory platforms and citizen science to explore complex governance challenges in domains that include education, environment, and health…(More)”

We Need To Innovate The Science Business Model


Greg Satell at Forbes: “In 1945, Vannevar Bush, the man who led the nation’s scientific efforts during World War II, delivered a proposal to President Truman for funding scientific research in the post-war world. Titled Science, The Endless Frontier, it led to the formation of the NSF, NIH, DARPA and other agencies….
One assumption inherent in Bush’s proposal was that institutions would be at the center of scientific life. Scientists from disparate labs could read each other’s papers and meet at an occasional conference, but for the most part, they would be dependent on the network of researchers within their organization and those close by.
Sometimes, the interplay between institutions had major, even historical, impacts, such as John von Neumann’s sponsorship of Alan Turing, but mostly the work you did was a function of where you did it. The proximity of Watson, Crick, Rosalind Franklin and Maurice Wilkins, for example, played a major role in the discovery of the structure of DNA.
Yet today, digital technology is changing not only the speed and ease of how we communicate, but the very nature of how we are able to collaborate.  When I spoke to Jonathan Adams, Chief Scientist at Digital Science, which develops and invests in software that makes science more efficient, he noted that there is a generational shift underway and said this:

When you talk to people like me, we’re established scientists who are still stuck in the old system of institutions and conferences.  But the younger scientists are using technology to access networks and they do so on an ongoing, rather than a punctuated basis.  Today, you don’t have to go to a conference or write a paper to exchange ideas.

Evidence would seem to bear this out. The prestigious journal Nature recently noted that the average scientific paper has four times as many authors as it did in the 1950s, when Bush’s career was at its height. Moreover, it’s become common for co-authors to work at far-flung institutions. Scientific practice needs to adapt to this reality.
There has been some progress in this area. The Internet, in fact, was created for the explicit purpose of scientific collaboration. Yet still, the way in which scientists report and share their findings remains much the same as a century ago.
Moving From Publications To Platforms For Discovery
One especially ripe area for innovation is publishing.  Typically, a researcher with a new discovery waits six months to a year for the peer review process to run its course before the work can be published.  Even then, many of the results are questionable at best.  Nature recently reported that the overwhelming majority of studies can’t be replicated…(More)”