Designing a Citizen Science and Crowdsourcing Toolkit for the Federal Government


Jenn Gustetic, Lea Shanley, Jay Benforado, and Arianne Miller at the White House Blog: “In the 2013 Second Open Government National Action Plan, President Obama called on Federal agencies to harness the ingenuity of the public by accelerating and scaling the use of open innovation methods, such as citizen science and crowdsourcing, to help address a wide range of scientific and societal problems.
Citizen science is a form of open collaboration in which members of the public participate in the scientific process, including identifying research questions, collecting and analyzing data, interpreting results, and solving problems. Crowdsourcing is a process in which individuals or organizations submit an open call for voluntary contributions from a large group of unknown individuals (“the crowd”) or, in some cases, a bounded group of trusted individuals or experts.
Citizen science and crowdsourcing are powerful tools that can help Federal agencies:

  • Advance and accelerate scientific research through group discovery and co-creation of knowledge. For instance, engaging the public in data collection can provide information at resolutions that would be difficult for Federal agencies to obtain due to time, geographic, or resource constraints.
  • Increase science literacy and provide students with skills needed to excel in science, technology, engineering, and math (STEM). Volunteers in citizen science or crowdsourcing projects gain hands-on experience doing real science, and take that learning outside of the classroom setting.
  • Improve delivery of government services with significantly lower resource investments.
  • Connect citizens to the missions of Federal agencies by promoting a spirit of open government and volunteerism.

To enable effective and appropriate use of these new approaches, the Open Government National Action Plan specifically commits the Federal government to “convene an interagency group to develop an Open Innovation Toolkit for Federal agencies that will include best practices, training, policies, and guidance on authorities related to open innovation, including approaches such as incentive prizes, crowdsourcing, and citizen science.”
On November 21, 2014, the Office of Science and Technology Policy (OSTP) kicked off development of the Toolkit with a human-centered design workshop. Human-centered design is a multi-stage process that requires product designers to engage with different stakeholders in creating, iteratively testing, and refining their product designs. The workshop was planned and executed in partnership with the Office of Personnel Management’s human-centered design practice known as “The Lab” and the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS), a growing network of more than 100 employees from more than 20 Federal agencies….
The Toolkit will help further the culture of innovation, learning, sharing, and doing in the Federal citizen science and crowdsourcing community: indeed, the development of the Toolkit is a collaborative and community-building activity in and of itself.
The following successful Federal projects illustrate the variety of possible citizen science and crowdsourcing applications:

  • The Citizen Archivist Dashboard (NARA) coordinates crowdsourced archival record tagging and document transcription. Recently, more than 170,000 volunteers indexed the 132 million names in the 1940 Census in only five months, a task NARA could not have completed alone.
  • Through Measuring Broadband America (FCC), 2 million volunteers collected and provided the FCC with data on their Internet speeds, which the FCC used to create a National Broadband Map revealing digital divides.
  • In 2014, Nature’s Notebook (USGS, NSF) volunteers recorded more than 1 million observations on plants and animals that scientists use to analyze environmental change.
  • Did You Feel It? (USGS) has enabled more than 3 million people worldwide to share their experiences during and immediately after earthquakes. This information facilitates rapid damage assessments and scientific research, particularly in areas without dense sensor networks.
  • The mPING (NOAA) mobile app has collected more than 600,000 ground-based observations that help verify weather models.
  • USAID anonymized and opened its loan guarantee data to volunteer mappers. Volunteers mapped 10,000 data points in only 16 hours, compared to the 60 hours officials expected.
  • The Air Sensor Toolbox (EPA), together with training workshops, scientific partners, technology evaluations, and a scientific instrumentation loan program, empowers communities to monitor and report local air pollution.

In early 2015, OSTP, in partnership with the Challenges and Prizes Community of Practice, will convene Federal practitioners to develop the other half of the Open Innovation Toolkit for prizes and challenges. Stay tuned!”
 

Mapping information economy business with big data: findings from the UK


NESTA: “This paper uses innovative ‘big data’ resources to measure the size of the information economy in the UK.

Key Findings

  • Counts of information economy firms are 42 per cent larger than SIC-based estimates
  • Using ‘big data’ estimates, the research finds 225,800 information economy businesses in the UK
  • Information economy businesses are highly clustered across the country, with very high counts in the Greater South East, notably London (especially central and east London), as well as big cities such as Manchester, Birmingham and Bristol
  • Looking at local clusters, we find hotspots in Middlesbrough, Aberdeen, Brighton, Cambridge and Coventry, among others

Information and Communications Technologies – and the digital economy they support – are of enduring interest to researchers and policymakers. National and local government are particularly keen to understand the characteristics and growth potential of ‘their’ digital businesses.
Given the recent resurgence of interest in industrial policy across many developed countries, there is now substantial policy interest in developing stronger, more competitive digital economies. For example, the UK’s current industrial strategy combines horizontal interventions with support for seven key sectors, of which the ‘information economy’ is one.
The desire to grow high–tech clusters is often prominent in the policy mix – for instance, the UK’s Tech City UK initiative, Regional Innovation Clusters in the US and elements of ‘smart specialisation’ policies in the EU.
In this paper, NIESR and Growth Intelligence use novel ‘big data’ sources to improve our understanding of information economy businesses in the UK – that is, those involved in the production of ICTs. We use this experience to critically reflect on some of the opportunities and challenges presented by big data tools and analytics for economic research and policymaking.”
– See more at: http://www.nesta.org.uk/publications/mapping-information-economy-business-big-data-findings-uk-0

Restoring Confidence in Open, Shared and Personal Data


Report of the UK Digital Government Review: “It is obvious that government needs to be able to use data both to deliver services and to present information to public view. How else would government know which bank account to place a pension payment into, or a citizen know the results of an election or how to contact their elected representatives?

As more and more data is created, preserved and shared in ever-increasing volumes, a number of urgent questions arise: about opportunities and hazards; about the importance of using best-practice techniques, insights and technologies developed in the private sector, academia and elsewhere; about the promises and limitations of openness; and about how all this might be articulated and made accessible to the public.

Government has already adopted “open data” (we will discuss this more in the next section) and there are now increasing calls for government to pay more attention to data analytics and so-called “big data” – although the first faltering steps to unlock benefits here have often ended in the discovery that using large-scale data is a far more nuanced business than was initially assumed.

Debates around government and data have often been extremely high-profile – the NHS care.data [27] debate was raging while this review was in progress – but they are also shrouded in terms that can generate confusion and complexities that are not easily summarized.

In this chapter we will unpick some of these terms and some parts of the debate. This is a detailed and complex area and there is much more that could have been included [28]. This is not an area that can easily be summarized into a simple bullet-pointed list of policies.

Within this report we will use the following terms and definitions [29], proceeding to a detailed analysis of each in turn:

  • 1. Open Data: data that can be freely used, reused and redistributed by anyone, subject only, at most, to the requirement to attribute and share-alike. Examples: insolvency notices in the London Gazette; government spending information; public transport information; Official National Statistics.
  • 2. Shared Data: restricted data provided to restricted organisations or individuals for restricted purposes. Examples: the National Pupil Database; NHS care.data; integrated health and social care; individual census returns.
  • 3. Personal Data: data that relate to a living individual who can be identified from those data (for the full legal definition, see [30]). Examples: health records; individual tax records; insolvency notices in the London Gazette; the National Pupil Database.

NB: these definitions overlap. Personal data can exist in both open and shared data.

This social productivity will help build future economic productivity; in the meantime it will improve people’s lives and it will enhance our democracy. From our analysis it was clear that there was room for improvement…”

White House: Help Shape Public Participation


Corinna Zarek and Justin Herman at the White House Blog: “Public participation — where citizens help shape and implement government programs — is a foundation of open, transparent, and engaging government services. From emergency management and regulatory development to science and education, better and more meaningful engagement with those who use public services can measurably improve government for everyone.
A team across the government is now working side-by-side with civil society organizations to deliver the first U.S. Public Participation Playbook, dedicated to providing best practices for how agencies can better design public participation programs, and suggested performance metrics for evaluating their effectiveness.
Developing a U.S. Public Participation Playbook has been an open government priority, and was included in both the first and second U.S. Open Government National Action Plans as part of the United States effort to increase public integrity in government programs. This resource reflects the commitment of the government and civic partners to measurably improve participation programs, and is designed using the same inclusive principles that it champions.
More than 30 Federal leaders from across diverse missions in public service have collaborated on draft best practices, or “plays,” led by the General Services Administration’s inter-agency SocialGov Community. The playbook is not limited to digital participation, and is designed to address needs from the full spectrum of public participation programs.
The plays are structured to provide best practices, tangible examples, and suggested performance metrics for government activities that already exist or are under development. Categories covered by the plays include encouraging community development and outreach, empowering participants through public/private partnerships, using data to drive decisions, and designing for inclusiveness and accessibility.
In developing this new resource, the team has been reaching out to more than a dozen civil society organizations and stakeholders, asking them to contribute as the Playbook is created. The team would like your input as well! Over the next month, contribute your ideas to the playbook using Madison, an easy-to-use, open source platform that allows for accountable review of each contribution.
Through this process, the team will work together to ensure that the Playbook reflects the best ideas and examples for agencies to use in developing and implementing their programs with public participation in mind. This resource will be a living document, and stakeholders from inside or outside of government should continually offer new insights — whether new plays, the latest case studies, or the most current performance metrics — to the playbook.
We look forward to seeing the public participate in the creation and evolution of the Public Participation Playbook!”

Look to Government—Yes, Government—for New Social Innovations


Paper by Christian Bason and Philip Colligan: “If asked to identify the hotbed of social innovation right now, many people would likely point to the new philanthropy of Silicon Valley or the social entrepreneurship efforts supported by Ashoka, Echoing Green, and the Skoll Foundation. Very few people, if any, would mention their state capital or Capitol Hill. While local and national governments may have driven some of the greatest advances in human history — from public education to putting a man on the moon — public bureaucracies are more commonly known to stifle innovation.
Yet, around the world, there are local, regional, and national government innovators who are challenging this paradigm. They are pioneering a new form of experimental government — bringing new knowledge and practices to the craft of governing and policy making; drawing on human-centered design, user engagement, open innovation, and cross-sector collaboration; and using data, evidence, and insights in new ways.
Earlier this year, Nesta, the UK’s innovation foundation (which Philip helps run), teamed up with Bloomberg Philanthropies to publish i-teams, the first global review of public innovation teams set up by national and city governments. The study profiled 20 of the most established i-teams from around the world, including:

  • French Experimental Fund for Youth, which has supported 554 experimental projects (such as one that reduces school drop-out rates) that have benefited over 480,000 young people;
  • Nesta’s Innovation Lab, which has run 70 open innovation challenges and programs supporting over 750 innovators working in fields as diverse as energy efficiency, healthcare, and digital education;
  • New Orleans’ Innovation and Delivery team, which achieved a 19% reduction in the number of murders in the city in 2013 compared to the previous year.

How are i-teams achieving these results? The most effective ones are explicit about the goal they seek – be it creating a solution to a specific policy challenge, engaging citizenry in behaviors that help the commonweal, or transforming the way government behaves. Importantly, these teams are also able to deploy the right skills, capabilities, and methods for the job.
In addition, i-teams have a strong bias toward action. They apply academic research in behavioral economics and psychology to public policy and services, focusing on rapid experimentation and iteration. The approach stands in stark contrast to the normal routines of government.
Take, for example, the UK’s Behavioural Insights Team (BIT), often called the Nudge Unit. It sets clear goals, engages the right expertise to prototype means to the end, and tests innovations rapidly in the field, learning what’s not working and rapidly scaling what is.
One of BIT’s most famous projects changed taxpayer behavior. BIT’s team of economists, behavioral psychologists, and seasoned government staffers came up with minor changes to tax letters, sent out by the UK Government, that subtly introduced positive peer pressure. By simply altering the letters to say that most people in their local area had already paid their taxes, BIT was able to boost repayment rates by around 5%. This trial was part of a range of interventions, which have helped bring forward over £200 million in additional tax revenue to HM Revenue & Customs, the UK’s tax authority.
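To make the experimental logic concrete, here is a minimal sketch of how a letter trial of this kind could be evaluated: a two-proportion z-test comparing repayment rates between a control arm (standard letter) and a treatment arm (social-norm letter). All counts, arm sizes, and names below are illustrative assumptions, not BIT’s actual data.

```python
# A minimal sketch of how a letter trial of this kind could be evaluated:
# a two-proportion z-test on repayment rates in two randomized arms.
# All counts and arm sizes below are hypothetical, for illustration only.
from math import sqrt, erf

def two_proportion_ztest(paid_control, n_control, paid_treatment, n_treatment):
    """Return (uplift, z, two-sided p-value) for treatment vs. control."""
    p_c = paid_control / n_control
    p_t = paid_treatment / n_treatment
    pooled = (paid_control + paid_treatment) / (n_control + n_treatment)
    se = sqrt(pooled * (1 - pooled) * (1 / n_control + 1 / n_treatment))
    z = (p_t - p_c) / se
    # Phi(x) = 0.5 * (1 + erf(x / sqrt(2))) is the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_t - p_c, z, p_value

# Hypothetical arms: 10,000 standard letters vs. 10,000 social-norm letters.
uplift, z, p = two_proportion_ztest(6800, 10000, 7300, 10000)
print(f"uplift={uplift:.1%}, z={z:.2f}, p={p:.2g}")
```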
The Danish government’s internal i-team, MindLab (which Christian ran for 8 years) has likewise influenced citizen behavior….”

USDA Opens VIVO Research Networking Tool to Public


Sharon Durham at the USDA: “VIVO, a Web application used internally by U.S. Department of Agriculture (USDA) scientists since 2012 to allow better national networking across disciplines and locations, is now available to the public. USDA VIVO will be a “one-stop shop” for Federal agriculture expertise and research outcomes. “USDA employs over 5,000 researchers to ensure our programs are based on sound public policy and the best available science,” said USDA Chief Scientist and Under Secretary for Research, Education, and Economics Dr. Catherine Woteki. “USDA VIVO provides a powerful Web search tool for connecting interdisciplinary researchers, research projects and outcomes with others who might bring a different approach or scope to a research project. Inviting private citizens to use the system will increase the potential for collaboration to solve food- and agriculture-related problems.”
The idea behind USDA VIVO is to link researchers with peers and potential collaborators to ignite synergy among our nation’s best scientific minds and to spark unique approaches to some of our toughest agricultural problems. This efficient networking tool enables scientists to easily locate others with a particular expertise. VIVO also makes it possible to quickly identify scientific expertise and respond to emerging agricultural issues, like specific plant and animal diseases or pests.
USDA’s Agricultural Research Service (ARS), Economic Research Service, National Institute of Food and Agriculture, National Agricultural Statistics Service and Forest Service are the first five USDA agencies to participate in VIVO. The National Agricultural Library, which is part of ARS, will host the Web application. USDA hopes to add other agencies in the future.
VIVO was in part developed under a $12.2 million grant from the National Center for Research Resources, part of the National Institutes of Health (NIH). The grant, made under the 2009 American Recovery and Reinvestment Act, was provided to the University of Florida and collaborators at Cornell University, Indiana University, Weill Cornell Medical College, Washington University in St. Louis, the Scripps Research Institute and the Ponce School of Medicine.
VIVO’s underlying database draws information about research being conducted by USDA scientists from official public systems of record and then makes it uniformly available for searching. The data can then be easily leveraged in other applications. In this way, USDA is also making its research projects and related impacts available to the Federal RePORTER tool, released by NIH on September 22, 2014. Federal RePORTER is part of a collaborative effort between Federal entities and other research institutions to create a repository that will be useful to assess the impact of Federal research and development investments.”
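Because VIVO stores its profile data as RDF linked data, a SPARQL query is the natural way an external application might search it. The sketch below is a minimal illustration under that assumption: the endpoint URL is hypothetical, though vivo:hasResearchArea is a property from the public VIVO core ontology.

```python
# Sketch: how an external application might search a VIVO-style linked-data
# store for expertise. VIVO stores profiles as RDF, so SPARQL is the natural
# query interface. The endpoint URL is hypothetical; vivo:hasResearchArea is
# a property from the public VIVO core ontology.
import requests

SPARQL_ENDPOINT = "https://vivo.example.gov/api/sparqlQuery"  # hypothetical

QUERY = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX vivo: <http://vivoweb.org/ontology/core#>
SELECT ?person ?name WHERE {
  ?person vivo:hasResearchArea ?area .
  ?area rdfs:label ?label .
  ?person rdfs:label ?name .
  FILTER CONTAINS(LCASE(STR(?label)), "plant disease")
} LIMIT 20
"""

resp = requests.post(SPARQL_ENDPOINT,
                     data={"query": QUERY},
                     headers={"Accept": "application/sparql-results+json"})
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["name"]["value"], "->", row["person"]["value"])
```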

Personalised Health and Care 2020: Using Data and Technology to Transform Outcomes for Patients and Citizens


Report and Framework of Action by the UK National Information Board: “One of the greatest opportunities of the 21st century is the potential to safely harness the power of the technology revolution, which has transformed our society, to meet the challenges of improving health and providing better, safer, sustainable care for all. To date the health and care system has only begun to exploit the potential of using data and technology at a national or local level. Our ambition is for a health and care system that enables people to make healthier choices, to be more resilient, to deal more effectively with illness and disability when it arises, and to have happier, longer lives in old age; a health and care system where technology can help tackle inequalities and improve access to services for the vulnerable.
The purpose of this paper is to consider what progress the health and care system has already made and what can be learnt from other industries and the wider economy…”

A New Ebola Crisis Page Built with Open Data


HDX team: “We are introducing a new Ebola crisis page that provides an overview of the data available in HDX. The page includes an interactive map of the worst-affected countries, the top-line figures for the crisis, a graph of cumulative Ebola cases and deaths, and over 40 datasets.
We have been working closely with UNMEER and WHO to make Ebola data available for public use. We have also received important contributions from the British Red Cross, InterAction, MapAction, the Standby Task Force, the US Department of Defense, and WFP, among others.

How we built it

The process to create this page started a couple of months ago by simply linking to existing data sites, such as OpenStreetMap’s geospatial data or OCHA’s common operational datasets. We then created a service by extracting the data on Ebola cases and deaths from the bi-weekly WHO situation report and making the raw files available for analysts and developers.
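As a minimal sketch of the output step of such an extraction service, the snippet below writes situation-report figures out as a raw, machine-readable CSV for analysts and developers. The record layout and figures are illustrative assumptions; the real WHO reports required genuine parsing work upstream.

```python
# Sketch of the extraction service's output step: writing case/death figures
# transcribed from a situation report to a raw, machine-readable CSV.
# The record layout and figures are illustrative assumptions, not WHO data.
import csv

records = [  # hypothetical transcription of one report
    {"country": "Guinea", "date": "2014-11-14", "cases": 2000, "deaths": 1200},
    {"country": "Liberia", "date": "2014-11-14", "cases": 7000, "deaths": 2800},
    {"country": "Sierra Leone", "date": "2014-11-14", "cases": 5500, "deaths": 1200},
]

with open("ebola_cases_raw.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["country", "date", "cases", "deaths"])
    writer.writeheader()
    writer.writerows(records)
```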
The OCHA Regional Office in Dakar contributed a dataset that included Ebola cases by district, which they had been collecting from reports by the national Ministries of Health since March 2014. This data was picked up by The New York Times graphics team and by Gapminder, which partnered with Google Crisis Response to add the data to the Google Public Data Explorer.

As more organizations shared Ebola datasets through HDX, users started to transform the data into useful graphs and maps. These visuals were then shared back with the wider community through the HDX gallery. We have incorporated many of these user-generated visual elements into the design of our new Ebola crisis page….”
See also Hacking Ebola.

Hashtag Standards For Emergencies


Key findings of a new report by the UN Office for the Coordination of Humanitarian Affairs: “

  • The public is using Twitter for real-time information exchange and for expressing emotional support during a variety of crises, such as wildfires, earthquakes, floods, hurricanes, political protests, mass shootings, and communicable-disease tracking. By encouraging proactive standardization of hashtags, emergency responders may be able to reduce a big-data challenge and better leverage crowdsourced information for operational planning and response.
  • Twitter is the primary social media platform discussed in this Think Brief. However, the use of hashtags has spread to other social media platforms, including Sina Weibo, Facebook, Google+ and Diaspora. As a result, the ideas behind hashtag standardization may have a much larger sphere of influence than just this one platform.
  • Three hashtag standards are encouraged and discussed: early standardization of the disaster name (e.g., #Fay), reporting non-emergency needs (e.g., #PublicRep) and requesting emergency assistance (e.g., #911US).
  • As well as standardizing hashtags, emergency response agencies should encourage the public to enable Global Positioning System (GPS) tagging when tweeting during an emergency. This will provide highly detailed information to facilitate response (a minimal filtering sketch follows this list).
  • Non-governmental groups, national agencies and international organizations should discuss the potential added value of monitoring social media during emergencies. These groups need to agree on who establishes the standards for a given country or event, which agency disseminates these prescriptive messages, and who collects and validates the incoming crowdsourced reports.
  • Additional efforts should be pursued regarding how to best link crowdsourced information into emergency response operations and logistics. If this information will be collected, the teams should be ready to act on it in a timely manner.”
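As a minimal sketch of what standardized hashtags plus GPS make possible, the snippet below triages a batch of tweet records, keeping only geotagged reports that carry one of the agreed tags. The record layout is an assumption for illustration; the hashtags follow the report’s examples.

```python
# Minimal sketch of triaging crowdsourced tweets with standardized tags:
# keep only geotagged reports carrying an agreed hashtag, so responders can
# map them. The record layout is an assumption; the tags follow the report's
# examples (#Fay for the event, #PublicRep and #911US for reporting).
EMERGENCY_TAGS = {"#fay", "#publicrep", "#911us"}

def triage(tweets):
    """Yield (tag, coordinates, text) for geotagged tweets with agreed tags."""
    for t in tweets:
        tags = {h.lower() for h in t.get("hashtags", [])}
        hits = tags & EMERGENCY_TAGS
        if hits and t.get("coordinates"):
            yield sorted(hits)[0], t["coordinates"], t["text"]

sample = [
    {"text": "Roof damage, need assistance #911US", "hashtags": ["#911US"],
     "coordinates": (27.95, -82.46)},
    {"text": "Power out downtown #Fay", "hashtags": ["#Fay"],
     "coordinates": None},  # no GPS: logged, but not mappable
]
for tag, coords, text in triage(sample):
    print(tag, coords, text)
```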

Politics, Policy and Privatisation in the Everyday Experience of Big Data in the NHS


Chapter by Andrew Goffey, Lynne Pettinger and Ewen Speed in Martin Hand and Sam Hillyard (eds.), Big Data? Qualitative Approaches to Digital Research (Studies in Qualitative Methodology, Volume 13): “This chapter explains how fundamental organisational change in the UK National Health Service (NHS) is being effected by new practices of digitised information gathering and use. It analyses the taken-for-granted IT infrastructures that lie behind digitisation and considers the relationship between digitisation and big data.
Design/methodology/approach

Qualitative research methods including discourse analysis, ethnography of software and key informant interviews were used. Actor-network theories, as developed by Science and Technology Studies (STS) researchers, were used to inform the research questions, data gathering and analysis. The chapter focuses on the aftermath of legislation to change the organisation of the NHS.

Findings

The chapter shows the benefits of qualitative research into specific manifestations of information technology. It explains how apparently ‘objective’ and ‘neutral’ quantitative data gathering and analysis is mediated by complex software practices. It considers the political power of claims that data is neutral.

Originality/value

The chapter provides insight into a specific case of healthcare data. It makes explicit the role of politics and the State in digitisation and shows how STS approaches can be used to understand political and technological practice.”