Tech challenge develops algorithms to predict mass atrocities


SciDevNet: “Mathematical models that use existing socio-political data to predict mass atrocities could soon inform governments and NGOs on how and where to take preventative action.
The models emerged from one strand of the Tech Challenge for Atrocity Prevention, a competition run by the US Agency for International Development (USAID) and NGO Humanity United. The winners were announced last month (18 November) and will now work with the organiser to further develop and pilot their innovations.
The five winners, who came from different countries and won between US$1,000 and US$12,000, were among nearly 100 entrants who developed algorithms to predict when and where mass atrocities are likely to happen.
Around 1.5 billion people live in countries affected by conflict, sometimes including atrocities such as genocides, mass rape and ethnic cleansing, according to the World Bank’s World Development Report 2011. Many of these countries are in the developing world.
The competition organisers hope the new algorithms could help governments and human rights organisations identify at-risk regions, potentially allowing them to intervene before mass atrocities happen.
The competition started from the premise that certain social and political measurements are linked to increased likelihood of atrocities. Yet because such factors interact in complex ways, organisations working to prevent atrocities lack a reliable method of predicting when and where they might happen next.
The algorithms use sociopolitical indicators and data on past atrocities as their inputs. The data was drawn from archives such as the Global Database of Events, Language and Tone, a data set that encodes more than 200 million globally newsworthy events, recording cultural information such as the people involved, their location and any religious connections.”
Link to the winners of the Model Challenge
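The article does not publish the winning models, but the general shape of such a predictor can be sketched. Below is a minimal, purely illustrative example: hypothetical GDELT-style event counts are combined through hand-picked weights into a logistic risk score per country. None of the country names, indicators, or weights come from the competition.

```python
import math

# Hypothetical per-country indicator counts, in the spirit of GDELT-style
# event data. All names and numbers here are illustrative assumptions,
# not taken from the winning models.
INDICATORS = {
    "country_a": {"protests": 40, "state_violence": 12, "hate_speech": 30},
    "country_b": {"protests": 5, "state_violence": 1, "hate_speech": 2},
}

WEIGHTS = {"protests": 0.02, "state_violence": 0.15, "hate_speech": 0.05}
BIAS = -3.0  # baseline log-odds of an atrocity in any given period

def atrocity_risk(counts):
    """Logistic risk score from weighted indicator counts."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in counts.items())
    return 1.0 / (1.0 + math.exp(-z))

# Rank countries by estimated risk, highest first.
ranked = sorted(INDICATORS, key=lambda c: atrocity_risk(INDICATORS[c]), reverse=True)
```

In practice the weights would be learned from data on past atrocities rather than set by hand, which is precisely where the entrants' models differ.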

Participation Dynamics in Crowd-Based Knowledge Production: The Scope and Sustainability of Interest-Based Motivation


New paper by Henry Sauermann and Chiara Franzoni: “Crowd-based knowledge production is attracting growing attention from scholars and practitioners. One key premise is that participants who have an intrinsic “interest” in a topic or activity are willing to expend effort at lower pay than in traditional employment relationships. However, it is not clear how strong and sustainable interest is as a source of motivation. We draw on research in psychology to discuss important static and dynamic features of interest and derive a number of research questions regarding interest-based effort in crowd-based projects. Among others, we consider the specific versus general nature of interest, highlight the potential role of matching between projects and individuals, and distinguish the intensity of interest at a point in time from the development and sustainability of interest over time. We then examine users’ participation patterns within and across 7 different crowd science projects that are hosted on a shared platform. Our results provide novel insights into contribution dynamics in crowd science projects. Moreover, given that extrinsic incentives such as pay, status, self-use, or career benefits are largely absent in these particular projects, the data also provide unique insights into the dynamics of interest-based motivation and into its potential as a driver of effort.”

Building tech-powered public services


New publication by Sarah Bickerstaffe from IPPR (UK): “Given the rapid pace of technological change and take-up by the public, it is a question of when not if public services become ‘tech-powered’. This new paper asks how we can ensure that innovations are successfully introduced and deployed.
Can technology improve the experience of people using public services, or does it simply mean job losses and a depersonalised offer to users?
Could tech-powered public services be an affordable, sustainable solution to some of the challenges of these times of austerity?
This report looks at 20 case studies of digital innovation in public services, using these examples to explore the impact of new and disruptive technologies. It considers how tech-powered public services can be delivered, focusing on the area of health and social care in particular.
We identify three key benefits of increasing the role of technology in public services: saving time, boosting user participation, and encouraging users to take responsibility for their own wellbeing.
In terms of how to successfully implement technological innovations in public services, five particular lessons stood out clearly and consistently:

  1. User-based iterative design is critical to delivering a product that solves real-world problems. It builds trust and ensures the technology works in the context in which it will be used.
  2. Public sector expertise is essential in order for a project to make the connections necessary for initial development and early funding.
  3. Access to seed and bridge funding is necessary to get projects off the ground and allow them to scale up.
  4. Strong leadership from within the public sector is crucial to overcoming the resistance that practitioners and managers often show initially.
  5. A strong business case that sets out the quality improvements and cost savings that the innovation can deliver is important to get attention and interest from public services.

The seven headline case studies in this report are:

  • Patchwork creates an elegant solution to join up professionals working with troubled families, in an effort to ensure that frontline support is truly coordinated.
  • Casserole Club links people who like cooking with their neighbours who are in need of a hot meal, employing the simplest possible technology to grow social connections.
  • ADL Smartcare uses a facilitated assessment tool to make professional expertise accessible to staff and service users without years of training, meaning they can carry out assessments together, engaging people in their own care and freeing up occupational therapists to focus where they are needed.
  • Mental Elf makes leading research in mental health freely available via social media, providing accessible summaries to practitioners and patients who would not otherwise have the time or ability to read journal articles, which are often hidden behind a paywall.
  • Patient Opinion provides an online platform for people to give feedback on the care they have received and for healthcare professionals and providers to respond, disrupting the typical complaints process and empowering patients and their families.
  • The Digital Pen and form system has saved the pilot hospital trust three minutes per patient by avoiding the need for manual data entry, freeing up clinical and administrative staff for other tasks.
  • Woodland Wiggle allows children in hospital to enter a magical woodland world through a giant TV screen, where they can have fun, socialise, and do their physiotherapy.”

Google Global Impact Award Expands Zooniverse


Press Release: “A $1.8 million Google Global Impact Award will enable Zooniverse, a nonprofit collaboration led by the Adler Planetarium and the University of Oxford, to make setting up a citizen science project as easy as starting a blog and could lead to thousands of innovative new projects around the world, accelerating the pace of scientific research.
The award supports the further development of the Zooniverse, the world’s leading ‘citizen science’ platform, which has already given more than 900,000 online volunteers the chance to contribute to science by taking part in activities including discovering planets, classifying plankton or searching through old ships’ logs for observations of interest to climate scientists. As part of the Global Impact Award, the Adler will receive $400,000 to support the Zooniverse platform.
With the Google Global Impact Award, Zooniverse will be able to rebuild their platform so that research groups with no web development expertise can build and launch their own citizen science projects.
“We are entering a new era of citizen science – this effort will enable prolific development of science projects in which hundreds of thousands of additional volunteers will be able to work alongside professional scientists to conduct important research – the potential for discovery is limitless,” said Michelle B. Larson, Ph.D., Adler Planetarium president and CEO. “The Adler is honored to join its fellow Zooniverse partner, the University of Oxford, as a Google Global Impact Award recipient.”
The Zooniverse – the world’s leading citizen science platform – is a global collaboration across several institutions that design and build citizen science projects. The Adler is a founding partner of the Zooniverse, which has already engaged more than 900,000 online volunteers as active scientists by discovering planets, mapping the surface of Mars and detecting solar flares. Adler-directed citizen science projects include: Galaxy Zoo (astronomy), Solar Stormwatch (solar physics), Moon Zoo (planetary science), Planet Hunters (exoplanets) and The Milky Way Project (star formation). The Zooniverse (zooniverse.org) also includes projects in environmental, biological and medical sciences. Google’s investment in the Adler and its Zooniverse partner, the University of Oxford, will further the global reach, making thousands of new projects possible.”

Selected Readings on Crowdsourcing Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing data was originally published in 2013.

As institutions seek to improve decision-making through data and put public data to use to improve the lives of citizens, new tools and projects are allowing citizens to play a role in both the collection and utilization of data. Participatory sensing and other citizen data collection initiatives, notably in the realm of disaster response, are allowing citizens to crowdsource important data, often using smartphones, that would be either impossible or burdensomely time-consuming for institutions to collect themselves. Civic hacking, often performed in hackathon events, on the other hand, is a growing trend in which governments encourage citizens to transform data from government and other sources into useful tools to benefit the public good.

Annotated Selected Reading List (in alphabetical order)

Baraniuk, Chris. “Power Politechs.” New Scientist 218, no. 2923 (June 29, 2013): 36–39. http://bit.ly/167ul3J.

  • In this article, Baraniuk discusses civic hackers, “an army of volunteer coders who are challenging preconceptions about hacking and changing the way your government operates. In a time of plummeting budgets and efficiency drives, those in power have realised they needn’t always rely on slow-moving, expensive outsourcing and development to improve public services. Instead, they can consider running a hackathon, at which tech-savvy members of the public come together to create apps and other digital tools that promise to enhance the provision of healthcare, schools or policing.”
  • While recognizing that “civic hacking has established a pedigree that demonstrates its potential for positive impact,” Baraniuk argues that a “more rigorous debate over how this activity should evolve, or how authorities ought to engage in it” is needed.

Barnett, Brandon, Muki Hansteen Izora, and Jose Sia. “Civic Hackathon Challenges Design Principles: Making Data Relevant and Useful for Individuals and Communities.” Hack for Change, https://bit.ly/2Ge6z09.

  • In this paper, researchers from Intel Labs offer “guiding principles to support the efforts of local civic hackathon organizers and participants as they seek to design actionable challenges and build useful solutions that will positively benefit their communities.”
  • The authors’ proposed design principles are:
    • Focus on the specific needs and concerns of people or institutions in the local community. Solve their problems and challenges by combining different kinds of data.
    • Seek out data far and wide (local, municipal, state, institutional, non-profits, companies) that is relevant to the concern or problem you are trying to solve.
    • Keep it simple! This can’t be overstated. Focus [on] making data easily understood and useful to those who will use your application or service.
    • Enable users to collaborate and form new communities and alliances around data.

Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” Perspectives on Psychological Science 6, no. 1 (January 1, 2011): 3–5. http://bit.ly/H56lER.

  • This article examines the capability of Amazon’s Mechanical Turk to act as a source of data for researchers, in addition to its traditional role as a microtasking platform.
  • The authors examine the demographics of MTurkers and find that “(a) MTurk participants are slightly more demographically diverse than are standard Internet samples and are significantly more diverse than typical American college samples; (b) participation is affected by compensation rate and task length, but participants can still be recruited rapidly and inexpensively; (c) realistic compensation rates do not affect data quality; and (d) the data obtained are at least as reliable as those obtained via traditional methods.”
  • The paper concludes that, just as MTurk can be a strong tool for crowdsourcing tasks, data derived from MTurk can be high quality while also being inexpensive and obtained rapidly.

Goodchild, Michael F., and J. Alan Glennon. “Crowdsourcing Geographic Information for Disaster Response: a Research Frontier.” International Journal of Digital Earth 3, no. 3 (2010): 231–241. http://bit.ly/17MBFPs.

  • This article examines issues of data quality in the face of the new phenomenon of geographic information being generated by citizens, in order to examine whether this data can play a role in emergency management.
  • The authors argue that “[d]ata quality is a major concern, since volunteered information is asserted and carries none of the assurances that lead to trust in officially created data.”
  • Due to the fact that time is crucial during emergencies, the authors argue that, “the risks associated with volunteered information are often outweighed by the benefits of its use.”
  • The paper examines four wildfires in Santa Barbara in 2007-2009 to discuss current challenges with volunteered geographical data, and concludes that further research is required to answer how volunteer citizens can be used to provide effective assistance to emergency managers and responders.

Hudson-Smith, Andrew, Michael Batty, Andrew Crooks, and Richard Milton. “Mapping for the Masses: Accessing Web 2.0 Through Crowdsourcing.” Social Science Computer Review 27, no. 4 (November 1, 2009): 524–538. http://bit.ly/1c1eFQb.

  • This article describes the way in which “we are harnessing the power of web 2.0 technologies to create new approaches to collecting, mapping, and sharing geocoded data.”
  • The authors examine GMapCreator and MapTube, which allow users to do a range of map-related functions such as create new maps, archive existing maps, and share or produce bottom-up maps through crowdsourcing.
  • They conclude that “these tools are helping to define a neogeography that is essentially ‘mapping for the masses,’” while noting that there are many issues of quality, accuracy, copyright, and trust that will influence the impact of these tools on map-based communication.

Kanhere, Salil S. “Participatory Sensing: Crowdsourcing Data from Mobile Smartphones in Urban Spaces.” In Distributed Computing and Internet Technology, edited by Chittaranjan Hota and Pradip K. Srimani, 19–26. Lecture Notes in Computer Science 7753. Springer Berlin Heidelberg, 2013. https://bit.ly/2zX8Szj.

  • This paper provides a comprehensive overview of participatory sensing — a “new paradigm for monitoring the urban landscape” in which “ordinary citizens can collect multi-modal data streams from the surrounding environment using their mobile devices and share the same using existing communications infrastructure.”
  • In addition to examining a number of innovative applications of participatory sensing, Kanhere outlines the following key research challenges:
    • Dealing with incomplete samples
    • Inferring user context
    • Protecting user privacy
    • Evaluating data trustworthiness
    • Conserving energy
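As a rough illustration of how two of the challenges listed above (protecting user privacy and evaluating data trustworthiness) might surface in a participatory sensing pipeline, here is a hypothetical sketch. The field names, thresholds, and grid size are assumptions for illustration, not taken from Kanhere's paper.

```python
from dataclasses import dataclass

# Each citizen-contributed reading carries a measured value, a location, and a
# device-reported GPS accuracy. Before sharing, coordinates are coarsened to a
# grid so exact positions are never uploaded (a simple privacy measure), and
# low-accuracy samples are dropped (a crude trustworthiness check).

@dataclass
class Reading:
    lat: float
    lon: float
    noise_db: float    # e.g., an urban noise-pollution measurement
    accuracy_m: float  # device-reported GPS accuracy in metres

def coarsen(value, grid=0.01):
    """Snap a coordinate to a ~1 km grid so exact positions are not shared."""
    return round(value / grid) * grid

def prepare_for_upload(r: Reading):
    """Return a privacy-preserving record, or None if the sample is untrustworthy."""
    if r.accuracy_m > 100:
        return None
    return {"lat": coarsen(r.lat), "lon": coarsen(r.lon), "noise_db": r.noise_db}
```

Real systems handle the remaining challenges (incomplete samples, user context, energy) with considerably more machinery; this only shows where such checks would sit.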

Data isn't a four-letter word


Speech by Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda: “I want to talk about data too: the opportunity as well as the threat.
Making data the engine of the European economy: safeguarding fundamental rights, capturing the data boost, and strengthening our defences.
Data is at a crossroads. We have opportunities; open data, big data, datamining, cloud computing. Tim Berners-Lee, creator of the world wide web, saw the massive potential of open data. As he put it, if you put that data online, it will be used by other people to do wonderful things, in ways that you could never imagine.
On the other hand, we have threats: to our privacy and our values, and to the openness that makes it possible to innovate, trade and exchange.
Get it right and we can safeguard a better economic future. Get it wrong, and we cut competitiveness without protecting privacy. So we remain dependent on the digital developments of others: and just as vulnerable to them.
How do we find that balance? Not with hysteria; nor by paralysis. Not by stopping the wonderful things, simply to prevent the not-so-wonderful. Not by seeing data as a dirty word.
We are seeing a whole economy develop around data and cloud computing. Businesses using them, whole industries depending on them, data volumes are increasing exponentially. Data is not just an economic sideshow, it is a whole new asset class; requiring new skills and creating new jobs.
And with a huge range of applications. From decoding human genes to predicting the traffic, and even the economy. Whatever you’re doing these days, chances are you’re using big data (like translation, search, apps, etc).
There is increasing recognition of the data boost on offer. For example, open data can make public administrations more transparent and stimulate a rich innovative market. That is what the G8 Leaders recognised in June, with their Open Data Charter. For scientists too, open data and open access offer new ways to research and progress.
That is a philosophy the Commission has shared for some time. And that is what our ‘Open Data’ package of December 2011 is all about. With new EU laws to open up public administrations, and a new EU Open Data Portal. And all EU-funded scientific publications available under open access.
Now not just the G8 and the Commission are seeing this data opportunity: but the European Council too. Last October, they recognised the potential of big data innovation, the need for a single market in cloud computing; and the urgency of Europe capitalising on both.
We will be acting on that. Next spring, I plan a strategic agenda for research on data. Working with private partners and national research funders to shape that agenda, and get the most bang for our research euro.
And, beyond research, there is much we can do to align our work and support secure big data. From training skilled workers, to modernising copyright for data and text mining, to different actors in the value chain working together: for example through a public-private partnership.
…Empowering people is not always easy in this complex online world. I want to see technical solutions emerge that can do that: giving users control over their desired level of privacy and how their data will be used, and making it easier to verify that online rights are respected.
How can we do that? How can we ensure systems that are empowering, transparent, and secure? There are a number of subtleties in play. Here’s my take.
First, companies engaged in big data will need to start thinking about privacy protection at every stage: from system development to procedures and practices.
This is the principle of “privacy by design”, set out clearly in the proposed Data Protection Regulation. In other words, from now on new business ideas have two purposes: delivering a service and protecting privacy at the right level.
Second, also under the regulation, big data applications that might put fundamental rights at risk would require the company to carry out a “Privacy Impact Assessment”. This is another good way to combine innovation and privacy: ensuring you think about any risks from the start.
Third, sometimes, particularly for personal data, a company might realise they need user consent. Consent is a cornerstone of data protection rules, and should stay that way.
But we need to get smart, and apply common sense to consent. Users can’t be expected to know everything. Nor asked to consent to what they cannot realistically understand. Nor presented with false dilemmas, a black-and-white choice between consenting or getting shut out of services.
Fourth, we can also get smart when it comes to anonymisation. Sometimes, full anonymisation means losing important information, so you can no longer make the links between data. That could make the difference between progress or paralysis. But using pseudonyms can let you analyse large amounts of data: to spot, for example, that people with genetic pattern X also respond well to therapy Y.
So it is understandable why the European Parliament has proposed a more flexible data protection regime for this type of data. Companies would be able to process the data on grounds of legitimate interest, rather than consent. That could make all the positive difference to big data: without endangering privacy.
Of course, in those cases, companies still need to minimise privacy risks. Their internal processes and risk assessments must show how they comply with the guiding principles of data protection law. And – if something does go wrong – the company remains accountable.
Indeed company accountability is another key element of our proposal. And here again we welcome the European Parliament’s efforts to reinforce that. Clearly, you might assure accountability in different ways for different companies. But standards for compliance and processes could make a real difference.
A single data protection law for Europe would be a big step forward. National fortresses and single market barriers just make it harder for Europe to lead in digital, harder for Europe to become the natural home of secure online services. Data protection cannot mean data protectionism. Rather, it means safeguarding privacy does not come at the expense of innovation: with laws both flexible and future proof, pragmatic and proportionate, for a changing world….
But data protection rules are really just the start. They are only part of our response to the Snowden revelations….”
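The pseudonymisation Kroes describes (linking records about the same person for analysis without exposing who that person is) can be sketched with a keyed hash. The key, record fields, and data below are illustrative assumptions, not any scheme endorsed in the speech.

```python
import hashlib
import hmac

# Identities are replaced by keyed hashes: records about the same person map
# to the same pseudonym (so links survive), but without the secret key the
# pseudonym cannot be reversed or recomputed from a name.
SECRET_KEY = b"keep-this-key-out-of-the-dataset"  # illustrative only

def pseudonym(identity: str) -> str:
    """Stable keyed pseudonym for an identity string."""
    return hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256).hexdigest()[:16]

records = [
    {"patient": "alice", "genetic_pattern": "X", "therapy_Y_response": "good"},
    {"patient": "alice", "genetic_pattern": "X", "therapy_Y_response": "good"},
]

# Same person -> same pseudonym, so the analytic link between records
# survives; the raw identity does not appear in the shared data.
pseudonymised = [{**r, "patient": pseudonym(r["patient"])} for r in records]
```

This is why pseudonymised data sits between fully identified and fully anonymised data in the Parliament's proposed regime: the X-and-Y correlation above remains computable, but only the key holder can tie it back to a person.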

Phone Apps Help Government, Others Counter Violence Against Women


NextGov: “Smart and mobile phones have helped authorities solve crimes from beatings that occurred during the London riots to the Boston Marathon bombing. A panel of experts gathered on Monday said the devices can also help reduce and combat rapes and other gender-based violence.
Smartphone apps and text messaging services proliferated in India following a sharp rise in reported gang rapes, including the brutal 2012 rape and murder of a 23-year-old medical student in Delhi, according to panelists at the Wilson Center event on gender-based violence and innovative technologies.
The apps fall into four main categories, said Alex Dehgan, chief data scientist at the United States Agency for International Development: apps that aid sexual assault and domestic violence victims, apps that empower women to fight back against gender-based violence, apps focused on advocacy, and apps that crowdsource and map cases of sexual assault.
The final category of apps is largely built on the Ushahidi platform, which was developed to track reports of missing people following the 2010 Haiti earthquake.
One of the apps, Safecity, offers real-time alerts about sexual assaults across India to help women identify unsafe areas.
Similar apps have been launched in Egypt and Syria, Dehgan said. In lower-tech countries the systems often operate using text messages rather than smartphone apps so they’re more widely accessible.
One of the greatest impediments to using mobile technology to reduce gender violence, said Christopher Burns, USAID’s team leader for mobile access, is that in developing nations women often don’t have access to their own mobile or smartphones, and that rural areas in the U.S. and abroad have limited service or broadband.
Burns suggested international policymakers should align plans for expanding broadband and mobile service with crowdsourced reports of gender violence.
“One suggestion for policy makers to focus on is to take a look at the crowd maps we’ve talked about today and see where there are greater incidences of gender-based violence and violence against women,” he said. “In all likelihood, those pockets probably don’t have the connectivity, don’t have the infrastructure [and] don’t have the capacity in place for survivors to benefit from those tools.”
One tool that’s been used in the U.S. is Circle of 6, an app for women on college campuses to automatically draw on friends when they think they’re in danger. The app allows women to pick six friends they can automatically text if they think they’re in a dangerous situation, asking them to call with an excuse for them to leave.
The app is designed to look like a game so it isn’t clear women are using their phones to seek help, said Nancy Schwartzman, executive director of Tech 4 Good, which developed the app.
Schwartzman has heard reports of gay men on college campuses using the app as well, she said. The military has been in contact with Tech 4 Good about developing a version of the app to combat sexual assault on military bases, she said.”

The Decay of American Political Institutions


In The American Interest: “Many political institutions in the United States are decaying. This is not the same thing as the broader phenomenon of societal or civilization decline, which has become a highly politicized topic in the discourse about America. Political decay in this instance simply means that a specific political process—sometimes an individual government agency—has become dysfunctional. This is the result of intellectual rigidity and the growing power of entrenched political actors that prevent reform and rebalancing. This doesn’t mean that America is set on a permanent course of decline, or that its power relative to other countries will necessarily diminish. Institutional reform is, however, an extremely difficult thing to bring about, and there is no guarantee that it can be accomplished without a major disruption of the political order. So while decay is not the same as decline, neither are the two discussions unrelated.
There are many diagnoses of America’s current woes. In my view, there is no single “silver bullet” cause of institutional decay, or of the more expansive notion of decline. In general, however, the historical context of American political development is all too often given short shrift in much analysis. If we look more closely at American history as compared to that of other liberal democracies, we notice three key structural characteristics of American political culture that, however they developed and however effective they have been in the past, have become problematic in the present.
The first is that, relative to other liberal democracies, the judiciary and the legislature (including the roles played by the two major political parties) continue to play outsized roles in American government at the expense of Executive Branch bureaucracies. Americans’ traditional distrust of government thus leads to judicial solutions for administrative problems. Over time this has become a very expensive and inefficient way to manage administrative requirements.
The second is that the accretion of interest group and lobbying influences has distorted democratic processes and eroded the ability of the government to operate effectively. What biologists label kin selection and reciprocal altruism (the favoring of family and friends with whom one has exchanged favors) are the two natural modes of human sociability. It is to these types of relationships that people revert when modern, impersonal government breaks down.
The third is that under conditions of ideological polarization in a federal governance structure, the American system of checks and balances, originally designed to prevent the emergence of too strong an executive authority, has become a vetocracy. The decision system has become too porous—too democratic—for its own good, giving too many actors the means to stifle adjustments in public policy. We need stronger mechanisms to force collective decisions but, because of the judicialization of government and the outsized role of interest groups, we are unlikely to acquire such mechanisms short of a systemic crisis. In that sense these three structural characteristics have become intertwined….
In short, the problems of American government flow from a structural imbalance between the strength and competence of the state, on the one hand, and the institutions that were originally designed to constrain the state, on the other. There is too much law and too much “democracy”, in the form of legislative intervention, relative to American state capacity. Some history can make this assertion clearer….
In well-functioning governance systems, moreover, a great deal of deliberation occurs not just in legislatures but within bureaucracies. This is not a matter of bureaucrats simply talking to one another, but rather a complex series of consultations between government officials and businesses, outside implementers and service providers, civil society groups, the media and other sources of information about societal interests and opinions. The Congress wisely mandated consultation in the landmark 1946 Administrative Procedure Act, which requires regulatory agencies to publicly post proposed rule changes and to solicit comment about them. But these consultative procedures have become highly routinized and pro forma, with actual decisions being the outcome not of genuine deliberation, but of political confrontations between well organized interest groups….”

The United States Releases its Second Open Government National Action Plan


Nick Sinai and Gayle Smith at the White House: “Since his first full day in office, President Obama has prioritized making government more open and accountable and has taken substantial steps to increase citizen participation, collaboration, and transparency in government. Today, the Obama Administration released the second U.S. Open Government National Action Plan, announcing 23 new or expanded open-government commitments that will advance these efforts even further.
…In September 2011, the United States released its first Open Government National Action Plan, setting a series of ambitious goals to create a more open government. The United States has continued to implement and improve upon the open-government commitments set forth in the first Plan, along with many more efforts underway across government, including implementing individual Federal agency Open Government Plans. The second Plan builds on these efforts, in part through a series of key commitments highlighted in a preview report issued by the White House in October 2013, in conjunction with the Open Government Partnership Annual Summit in London.
Among the highlights of the second National Action Plan:

  • “We the People”: The White House will introduce new improvements to the We the People online petitions platform aimed at making it easier to collect and submit signatures and increase public participation in using this platform. Improvements will enable the public to perform data analysis on the signatures and petitions submitted to We the People, as well as include a more streamlined process for signing petitions and a new Application Programming Interface (API) that will allow third-parties to collect and submit signatures from their own websites.
  • Freedom of Information Act (FOIA) Modernization: The FOIA encourages accountability through transparency and represents an unwavering national commitment to open government principles. Improving FOIA administration is one of the most effective ways to make the U.S. Government more open and accountable. Today, we announced five commitments to further modernize FOIA processes, including launching a consolidated online FOIA service to improve customers’ experience, creating and making training resources available to FOIA professionals and other Federal employees, and developing common FOIA standards for agencies across government.
  • The Global Initiative on Fiscal Transparency (GIFT): The United States will join GIFT, an international network of governments and non-government organizations aimed at enhancing financial transparency, accountability, and stakeholder engagement. The U.S. Government will actively participate in the GIFT Working Group and seek opportunities to collaborate with stakeholders and champion greater fiscal openness and transparency in domestic and global spending.
  • Open Data to the Public: Over the past few years, government data has been used by journalists to uncover variations in hospital billings, by citizens to learn more about the social services provided by charities in their communities, and by entrepreneurs building new software tools to help farmers plan and manage their crops.  Building on the U.S. Government’s ongoing open data efforts, new commitments will make government data even more accessible and useful for the public, including by reforming how Federal agencies manage government data as a strategic asset, launching a new version of Data.gov to make it even easier to discover, understand, and use open government data, and expanding access to agriculture and nutrition data to help farmers and communities.
  • Participatory Budgeting: The United States will promote community-led participatory budgeting as a tool for enabling citizens to play a role in identifying, discussing, and prioritizing certain local public spending projects, and for giving citizens a voice in how taxpayer dollars are spent in their communities. This commitment will include steps by the U.S. Government to help raise awareness of the fact that participatory budgeting may be used for certain eligible Federal community development grant programs.
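The We the People commitment above mentions a read/write API that lets third parties retrieve petition data and submit signatures from their own sites. As a minimal sketch of the read side, the snippet below builds a query URL and extracts signature counts from a decoded JSON response. The endpoint URL, parameter names, and the `signatureCount` field reflect the API as documented around this period, but are treated here as assumptions for illustration; no network call is made.

```python
from urllib.parse import urlencode

# Assumed base endpoint for the We the People read API (circa 2013);
# verify against the current API documentation before relying on it.
BASE_URL = "https://api.whitehouse.gov/v1/petitions.json"

def build_petitions_query(limit=10, offset=0, status=None):
    """Build a query URL for retrieving a page of petitions."""
    params = {"limit": limit, "offset": offset}
    if status is not None:
        params["status"] = status  # e.g. "open" or "closed" (assumed values)
    return f"{BASE_URL}?{urlencode(params)}"

def signature_counts(response):
    """Map petition titles to signature counts from a decoded JSON response."""
    return {p["title"]: p["signatureCount"] for p in response.get("results", [])}

# Example with a mocked response in the shape the API was documented to return:
sample = {"results": [{"title": "Example petition", "signatureCount": 12345}]}
print(build_petitions_query(limit=2, status="open"))
print(signature_counts(sample))
```

A third-party site doing the kind of data analysis the Plan describes would page through results with `limit`/`offset` and aggregate the counts locally.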

Other initiatives launched or expanded today include: increasing open innovation by encouraging expanded use of challenges, incentive prizes, citizen science, and crowdsourcing to harness American ingenuity, as well as modernizing the management of government records by leveraging technology to make records less burdensome to manage and easier to use and share. There are many other exciting open-government initiatives described in the second Plan — and you can view them all here.”

Index: Measuring Impact with Evidence


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on measuring impact with evidence and was originally published in 2013.

United States

  • Amount per $100 of government spending that is backed by evidence that the money is being spent wisely: less than $1
  • Share of healthcare treatments delivered in the U.S. that lack evidence of effectiveness: more than half
  • How much of total U.S. healthcare expenditure is spent to determine what works: less than 0.1 percent
  • Proportion of major U.S. federal social programs evaluated since 1990 using randomized experiments and found to have “weak or no positive effects”: 9 out of 10
  • Year the Coalition for Evidence-Based Policy was set up to work with federal policymakers to advance evidence-based reforms in major U.S. social programs: 2001
  • Year the Program Assessment Rating Tool (PART) was introduced by President Bush’s Office of Management and Budget (OMB): 2002
    • Share of the roughly 1,000 programs assessed that were found to be effective in 2008: 19%
    • Percentage of programs that could not be assessed due to insufficient data: 17%
    • Amount spent on the Even Start Family Literacy Program, rated ineffective by PART, over the life of the Bush administration: more than $1 billion
  • Year the Washington State legislature began using the Washington State Institute for Public Policy’s estimates on how “a portfolio of evidence-based and economically sound programs . . . could affect the state’s crime rate, the need to build more prisons, and total criminal-justice spending”: 2007
    • Amount invested by legislature in these programs: $48 million
    • Amount saved by the legislature: $250 million
  • Number of U.S. States in a pilot group working to adapt The Pew-MacArthur Results First Initiative, based on the Washington State model, to make performance-based policy decisions: 14
  • Net savings in health care expenditure by using the Transitional Care Model, which meets the Congressionally-based Top Tier Evidence Standard: $4,000 per patient
  • Number of states that conducted “at least some studies that evaluated multiple program or policy options for making smarter investments of public dollars” between 2008-2011: 29
  • Number of states that reported that their cost-benefit analysis influenced policy decisions or debate: 36
  • Year the Office of Management and Budget issued a memorandum proposing new evaluations and advising agencies to include details on determining the effectiveness of their programs, to link disbursement to evidence, and to support evidence-based initiatives: 2007
  • Percentage increase in resources for innovation funds that use a tiered model for evidence, according to the President’s FY14 budget: 44%
  • Amount President Obama proposed in his FY 2013 budget to allocate in existing funding to Performance Partnerships “in which states and localities would be given the flexibility to propose better ways to combine federal resources in exchange for greater accountability for results”: $200 million
  • Share of U.S. federal program funding that Harvard economist Jeffrey Liebman suggests be directed towards evaluations of outcomes: 1%
  • Amount of funding the City of New York has committed for evidence-based research and development initiatives through its Center for Economic Opportunity: $100 million a year

Internationally

  • How many of the 30 OECD countries in 2005–06 had a formal requirement by law that the benefits of regulation justify the costs: half
    • Number of the 30 OECD member countries in 2008 that reported quantifying the benefits of regulations: 16
    • Those that reported quantifying costs: 24
  • How many members make up the Alliance for Useful Evidence, a network that “champion[s] evidence, the opening up of government data for interrogation and use, alongside the sophistication in research methods and their applications”: over 1,000
  • Date the UK government, the ESRC and the Big Lottery Fund announced plans to create a network of ‘What Works’ evidence centres: March 2013
  • Core funding for the What Works Centre for Local Economic Growth: £1m p.a. over an initial three year term
  • How many SOLACE Summit members in 2012 were “very satisfied” with how Research and Intelligence resources support evidence-based decision-making: 4%
    • Number of areas they identified for improving evidence-based decision-making: 5
    • Evaluation of the impact of past decisions: 46% of respondents
    • Benchmarking data with other areas: 39%
    • Assessment of options available: 33%
    • How evidence is presented: 29%
    • Feedback on public engagement and consultation: 25%
  • Number of areas for improvement for Research and Intelligence staff development identified at the SOLACE Summit: 6
    • Strengthening customer insight and data analysis: 49%
    • Impact evaluation: 48%
    • Strategic/corporate thinking/awareness: 48%
    • Political acumen: 46%
    • Raising profile/reputation of the council for evidence-based decisions: 37%
    • Guidance/mentoring on use of research for other officers: 25%

Sources