Accelerating Citizen Science and Crowdsourcing to Address Societal and Scientific Challenges


Tom Kalil et al. at the White House Blog: “Citizen science encourages members of the public to voluntarily participate in the scientific process. Whether by asking questions, making observations, conducting experiments, collecting data, or developing low-cost technologies and open-source code, members of the public can help advance scientific knowledge and benefit society.

Through crowdsourcing – an open call for voluntary assistance from a large group of individuals – Americans can study and tackle complex challenges by conducting research at large geographic scales and over long periods of time in ways that professional scientists working alone cannot easily duplicate. These challenges include understanding the structure of proteins related to viruses in order to support development of new medications, or preparing for, responding to, and recovering from disasters.

…OSTP is today announcing two new actions that the Administration is taking to encourage and support the appropriate use of citizen science and crowdsourcing at Federal agencies:

  1. OSTP Director John Holdren is issuing a memorandum entitled Addressing Societal and Scientific Challenges through Citizen Science and Crowdsourcing. This memo articulates principles that Federal agencies should embrace to derive the greatest value and impact from citizen science and crowdsourcing projects. The memo also directs agencies to take specific actions to advance citizen science and crowdsourcing, including designating an agency-specific coordinator for citizen science and crowdsourcing projects, and cataloguing citizen science and crowdsourcing projects that are open for public participation on a new, centralized website to be created by the General Services Administration, making it easy for people to find out about and join in these projects.
  2. Fulfilling a commitment made in the 2013 Open Government National Action Plan, the U.S. government is releasing the first-ever Federal Crowdsourcing and Citizen Science Toolkit to help Federal agencies design, carry out, and manage citizen science and crowdsourcing projects. The toolkit, which was developed by OSTP in partnership with the Federal Community of Practice for Crowdsourcing and Citizen Science and GSA’s Open Opportunities Program, reflects the input of more than 125 Federal employees from over 25 agencies on ideas, case studies, best management practices, and other lessons to facilitate the successful use of citizen science and crowdsourcing in a Federal context….(More)”


Open Science Revolution – New Ways of Publishing Research in The Digital Age


Scicasts: “A massive increase in the power of digital technology over the past decade allows us today to publish any article, blog post or tweet in a matter of seconds.

Much of the information on the web is also free – newspapers are embracing open access to their articles and many websites are licensing their content under Creative Commons licenses, most of which allow the re-use and sharing of the original work at no cost.

In contrast to this openness, science publishing still lags behind. Most of the scientific knowledge generated in the past two centuries is hidden behind a paywall, requiring an average reader to pay tens to hundreds of euros to access an original study report written by scientists.

Can we not do things differently?

An answer to this question led to the creation of a number of new concepts that emerged over the past few years. A range of innovative open online science platforms are now trying “to do things differently”, offering researchers alternative ways of publishing their discoveries, making the publishing process faster and more transparent.

Here is a handful of examples, implemented by three companies – a recently launched open access journal Research Ideas and Outcomes (RIO), an open publishing platform F1000Research from The Faculty of 1000 and a research and publishing network ScienceOpen. Each has something different to offer, yet all of them seem to agree that science research should be open and accessible to everyone.

New concept – publish all research outputs

While the two-centuries-old tradition of science publishing lives and dies on exposing only the final outcomes of a research project, the RIO journal suggests a different approach. If we can follow new stories online step by step as they unfold (something that journalists have figured out and use in live reporting), they say, why not apply similar principles to research projects?

“RIO is the first journal that aims at publishing the whole research cycle and definitely the first one, to my knowledge, that tries to do that across all science branches – all of humanities, social sciences, engineering and so on,” says a co-founder of the RIO journal, Prof. Lyubomir Penev, in an interview to Scicasts.

From the original project outline, to datasets, software and methodology, each part of the project can be published separately. “The writing platform ARPHA, which underpins RIO, handles the whole workflow – from the stage when you write the first letter, to the end,” explains Prof. Penev.

At an early stage, the writing process is closed from public view and researchers may invite their collaborators and peers to view their project, add data and contribute to its development. Scientists can choose to publish any part of their project as it progresses – they can submit to the open platform their research idea, hypothesis or a newly developed experimental protocol, alongside future datasets and whole final manuscripts.

Some intermediate research stages and preliminary results can also be submitted to the platform F1000Research, which developed their own online authoring tool F1000Workspace, similar to ARPHA….(More)”

Ethical, Safe, and Effective Digital Data Use in Civil Society


Blog by Lucy Bernholz, Rob Reich, Emma Saunders-Hastings, and Emma Leeds Armstrong: “How do we use digital data ethically, safely, and effectively in civil society? We have developed three early principles for consideration:

  • Default to person-centered consent.
  • Prioritize privacy and minimum viable data collection.
  • Plan from the beginning to open (share) your work.

This post provides a synthesis from a one-day workshop that informed these principles. It concludes with links to draft guidelines you can use to inform partnerships between data consultants/volunteers and nonprofit organizations….(More)

These three values — consent, minimum viable data collection, and open sharing — comprise a basic framework for ethical, safe, and effective use of digital data by civil society organizations. They should be integrated into partnerships with data intermediaries and, perhaps, into general data practices in civil society.

We developed two tools to guide conversations between data volunteers and/or consultants and nonprofits. These are downloadable below. Please use them, share them, improve them, and share them again….

  1. Checklist for NGOs and external data consultants
  2. Guidelines for NGOs and external data consultants (More)”

Public service coding: the BBC as an open software developer


Juan Mateos-Garcia at NESTA: “On Monday, the BBC published British, Bold, Creative, a paper in which it put forward a vision for its future based on openness and collaboration with its audiences and the UK’s wider creative industries.

In this blog post, we focus on an area where the BBC is already using an open and collaborative model for innovation: software development.

The value of software

Although less visible to the public than its TV, radio and online content programming, the BBC’s software development activities may create value and drive innovation beyond the BBC, providing an example of how the corporation can put its “technology and digital capabilities at the service of the wider industry.”

Software is an important form of innovation investment that helps the BBC deliver new products and services, and become more efficient. One might expect that much of the software developed by the BBC would also be of value to other media and digital organisations. Such beneficial “spillovers” are encouraged by the BBC’s use of open source licensing, which enables other organisations to download its software for free, change it as they see fit, and share the results.

Current debates about the future of the BBC – including the questions about its role in influencing the future technology landscape in the Government’s Charter Review Consultation – need to be informed by robust evidence about how it develops software, and the impact that this has.

In this blog post, we use data from the world’s biggest collaborative software development platform, GitHub, to study the BBC as an open software developer.

GitHub gives organisations and individuals hosting space to store their projects (referred to as “repos”), and tools to coordinate development. This includes the option to “fork” (copy) other users’ software, change it and redistribute the improvements. Our key questions are:

  • How active is the BBC on GitHub?
  • How has its presence on GitHub changed over time?
  • What is the level of adoption (forking) of BBC projects on GitHub?
  • What types of open source projects is the BBC developing?
  • Where in the UK and in the rest of the world are the people interested in BBC projects based?
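Questions like these can be explored programmatically. The sketch below is illustrative only: the GitHub REST endpoint named is real, but the organisation handle and the summary logic are our own assumptions, not part of the NESTA analysis.

```python
import json
from urllib.request import urlopen

# Real GitHub REST endpoint for listing an organisation's repos;
# the "bbc" handle and the summary below are illustrative assumptions.
API_URL = "https://api.github.com/orgs/bbc/repos?per_page=100"

def summarise_repos(repos):
    """Aggregate basic activity measures from a list of GitHub repo records."""
    total_forks = sum(r.get("forks_count", 0) for r in repos)
    most_forked = (max(repos, key=lambda r: r.get("forks_count", 0))["name"]
                   if repos else None)
    return {"repo_count": len(repos),
            "total_forks": total_forks,
            "most_forked": most_forked}

# Fetching one page of live data (network required; pagination omitted):
#   repos = json.load(urlopen(API_URL))
#   print(summarise_repos(repos))
```

Fork counts are a rough proxy for adoption; a fuller study would also paginate through all repos and inspect stars, contributors and commit activity.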

But before tackling these questions, it is important to address a question often raised in relation to open source software:

Why might an organisation like the BBC want to share its valuable code on a platform like GitHub?

There are several possible reasons:

  • Quality: Opening up a software project attracts help from other developers, making it better
  • Adoption: Releasing software openly can help turn it into a widely adopted standard
  • Signalling: It signals the organisation as an interesting place to work and partner with
  • Public value: Some organisations release their code openly with the explicit goal of creating public value

The webpage introducing TAL (Television Application Layer), a BBC project on GitHub, is a case in point: “Sharing TAL should make building applications on TV easier for others, helping to drive the uptake of this nascent technology. The BBC has a history of doing this and we are always looking at new ways to reach our audience.”…(More)

US citizenship and immigration services to host Twitter ‘office hours’


NextGov: “U.S. Citizenship and Immigration Services wants to get more social with prospective immigrants.

USCIS will host its first-ever Twitter office hours Tuesday from 3-4 p.m. using the #AskUSCIS hashtag. Agency officials hope to provide another avenue for customers to ask questions and receive real-time feedback, according to a USCIS blog post.

To participate, customers just have to follow @USCIS on Twitter, use the hashtag and ask away, although the blog post makes clear staff won’t answer case-specific questions or provide case status updates.

The post also warns Twitter users not to post Social Security numbers, receipt numbers or any other personally identifiable information.

“With Twitter office hours, we want to help you – either as you’re preparing forms or after you’ve filed,” the blog post states.

USCIS will post a transcript of the questions and answers to its blog following the Twitter office hours, and if the concept is successful, the agency plans to host the sessions on a regular basis.

The agency’s social outreach plan is part of broader effort among federal agencies to improve customer experience.

This particular variant of digital engagement mirrors an effort championed by the Office of Federal Student Aid. FAFSA answers questions on the last Wednesday of every month using the hashtag #AskFAFSA, an effort that’s helped build its digital engagement significantly in recent years….(More)”

Review Federal Agencies on Yelp…and Maybe Get a Response


Yelp Official Blog: “We are excited to announce that Yelp has concluded an agreement with the federal government that will allow federal agencies and offices to claim their Yelp pages, read and respond to reviews, and incorporate that feedback into service improvements.

We encourage Yelpers to review any of the thousands of agency field offices, TSA checkpoints, national parks, Social Security Administration offices, landmarks and other places already listed on Yelp if you have good or bad feedback to share about your experiences. Not only is it helpful to others who are looking for information on these services, but you can actually make an impact by sharing your feedback directly with the source.

It’s clear Washington is eager to engage with people directly through social media. Earlier this year a group of 46 lawmakers called for the creation of a “Yelp for Government” in order to boost transparency and accountability, and Representative Ron Kind reiterated this call in a letter to the General Services Administration (GSA). Luckily for them, there’s no need to create a new platform now that government agencies can engage directly on Yelp.

As this agreement is fully implemented in the weeks and months ahead, we’re excited to help the federal government more directly interact with and respond to the needs of citizens and to further empower the millions of Americans who use Yelp every day.

In addition to working with the federal government, last week we announced our partnership with ProPublica to incorporate health care statistics and consumer opinion survey data onto the Yelp business pages of more than 25,000 medical treatment facilities. We’ve also partnered with local governments in expanding the LIVES open data standard to show restaurant health scores on Yelp….(More)”

What factors influence transparency in US local government?


Grichawat Lowatcharin and Charles Menifield at LSE Impact Blog: “The Internet has opened a new arena for interaction between governments and citizens, as it not only provides more efficient and cooperative ways of interacting, but also more efficient service delivery, and more efficient transaction activities. …But to what extent does increased Internet access lead to higher levels of government transparency?

…While we found Internet access to be a significant predictor of Internet-enabled transparency in our simplest model, this finding did not hold true in our most extensive model. This does not negate the fact that the variable is an important factor in assessing transparency levels and Internet access. …Our data shows that total land area, population density, percentage of minority, education attainment, and the council-manager form of government are statistically significant predictors of Internet-enabled transparency. These findings both confirm and contradict the findings of previous researchers. For example, while the effect of education on transparency appears to be the most consistent finding in previous research, we also noted that the rural/urban (population density) dichotomy and the education variable are important factors in assessing transparency levels.

Hence, as governments create strategic plans that include growth models, they should not only consider the budgetary ramifications of growth, but also the fact that educated residents want more web-based interaction with government. This finding was reinforced by a recent Census Bureau report indicating that some of the cities and counties in Florida and California had population increases greater than ten thousand persons per month during the period 2013-2014.

This article is based on the paper ‘Determinants of Internet-enabled Transparency at the Local Level: A Study of Midwestern County Web Sites’, in State and Local Government Review. (More)”

Yelp’s Consumer Protection Initiative: ProPublica Partnership Brings Medical Info to Yelp


Yelp Official Blog: “…exists to empower and protect consumers, and we’re continually focused on how we can enhance our service while enhancing the ability for consumers to make smart transactional decisions along the way.

A few years ago, we partnered with local governments to launch the LIVES open data standard. Now, millions of consumers find restaurant inspection scores when that information is most relevant: while they’re in the middle of making a dining decision (instead of when they’re signing the check). Studies have shown that displaying this information more prominently has a positive impact.
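LIVES (the Local Inspector Value-Entry Specification) distributes inspection results as plain CSV files, which makes feeds straightforward to consume. The sketch below is illustrative: the rows are invented, and the exact column names should be checked against the published specification.

```python
import csv
import io

# Invented sample rows in a LIVES-style inspections.csv layout;
# verify field names against the published LIVES specification.
SAMPLE = """business_id,score,date
B001,92,20150801
B001,88,20150115
B002,75,20150610
"""

def latest_scores(inspections_csv):
    """Return the most recent inspection score per business_id."""
    latest = {}
    for row in csv.DictReader(io.StringIO(inspections_csv)):
        bid, date = row["business_id"], row["date"]
        # Dates are YYYYMMDD strings, so lexicographic comparison works.
        if bid not in latest or date > latest[bid][0]:
            latest[bid] = (date, int(row["score"]))
    return {bid: score for bid, (date, score) in latest.items()}
```

Keeping only the latest score per business mirrors what a consumer-facing page would display at decision time.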

Today we’re excited to announce we’ve joined forces with ProPublica to incorporate health care statistics and consumer opinion survey data onto the Yelp business pages of more than 25,000 medical treatment facilities. Read more in today’s Washington Post story.

We couldn’t be more excited to partner with ProPublica, the Pulitzer Prize winning non-profit newsroom that produces investigative journalism in the public interest.

The information is compiled by ProPublica from their own research and the Centers for Medicare and Medicaid Services (CMS) for 4,600 hospitals, 15,000 nursing homes, and 6,300 dialysis clinics in the US and will be updated quarterly. Hover text on the business page will explain the statistics, which include number of serious deficiencies and fines per nursing home and emergency room wait times for hospitals. For example, West Kendall Baptist Hospital has better than average doctor communication and an average 33 minute ER wait time, Beachside Nursing Center currently has no deficiencies, and San Mateo Dialysis Center has a better than average patient survival rate.

Now the millions of consumers who use Yelp to find and evaluate everything from restaurants to retail will have even more information at their fingertips when they are in the midst of the most critical life decisions, like which hospital to choose for a sick child or which nursing home will provide the best care for aging parents….(More)

The Data Divide: What We Want and What We Can Get


Craig Adelman and Erin Austin at Living Cities (Read Blog 1): There is no shortage of data. At every level–federal, state, county, city and even within our own organizations–we are collecting and trying to make use of data. Data is a catch-all term that suggests universal access and easy use. The problem? In reality, data is often expensive, difficult to access, created for a single purpose, quickly changing and difficult to weave together. To aid and inform future data-dependent research initiatives, we’ve outlined the common barriers that community development faces when working with data and identified three ways to overcome them.

Common barriers include:

  • Data often comes at a hefty price. …
  • Data can come with restrictions and regulations. …
  • Data is built for a specific purpose, meaning information isn’t always in the same place. …
  • Data can actually be too big. ….
  • Data gaps exist. …
  • Data can be too old. ….

As you can tell, there can be many complications when it comes to working with data, but there is still great value in using and having it. We’ve found a few ways to overcome these barriers when scoping a research project:

1) Prepare to have to move to “Plan B” when trying to get answers that aren’t readily available in the data. It is incredibly important to be able to react to unexpected data conditions and to use proxy datasets when necessary in order to efficiently answer the core research question.

2) Building a data budget for your work is also advisable, as you shouldn’t anticipate that public entities or private firms will give you free data (nor that community development partners will be able to share datasets used for previous studies).

3) Identifying partners—including local governments, brokers, and community development or CDFI partners—is crucial to collecting the information you’ll need….(More)

Four things policy-makers need to know about social media data and real time analytics


Ella McPherson at LSE’s Impact Blog: “I recently gave evidence to the House of Commons Science and Technology Select Committee. This was based on written evidence co-authored with my colleague, Anne Alexander, and submitted to their ongoing inquiry into social media data and real time analytics. Both Anne and I research the use of social media during contested times; Anne looks at its use by political activists and labour movement organisers in the Arab world, and I look at its use in human rights reporting. In both cases, the need to establish facticity is high, as is the potential for the deliberate or inadvertent falsification of information. Similarly to the case that Carruthers makes about war reporting, we believe that the political-economic, methodological, and ethical issues raised by media dynamics in the context of crisis are bellwethers for the dynamics in more peaceful and mundane contexts.

From our work we have learned four crucial lessons that policy-makers considering this issue should understand:

1.  Social media information is vulnerable to a variety of distortions – some typical of all information, and others more specific to the characteristics of social media communications….

2.  If social media information is used to establish events, it must be verified; while technology can hasten this process, it is unlikely to ever occur in real time due to the subjective, human element of judgment required….


3.  Verifying social media information may require identifying its source, which has ethical implications related to informed consent and anonymisation….

4.  Another way to think about social media information is as what Hermida calls an ‘awareness system,’ which reduces the need to collect source identities; under this approach, researchers look at volume rather than veracity to recognise information of interest… (More)
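An "awareness system" of this kind can be sketched very simply: bucket message timestamps into time windows and flag windows whose volume is far above the mean, without ever touching source identities. The function below is an illustrative sketch of that idea, not a method proposed in the evidence.

```python
from collections import Counter

def volume_spikes(timestamps, window=60, factor=3.0):
    """Flag time windows whose message volume exceeds `factor` times the mean.

    `timestamps` are integer seconds. Poster identities are never needed --
    only aggregate volume, in the spirit of an 'awareness system'.
    """
    buckets = Counter(t // window for t in timestamps)
    if not buckets:
        return []
    mean = sum(buckets.values()) / len(buckets)
    # Return the start time (in seconds) of each spiking window.
    return sorted(b * window for b, n in buckets.items() if n > factor * mean)
```

Because only counts per window are stored, this approach sidesteps the consent and anonymisation questions raised in point 3, at the cost of saying nothing about the veracity of any individual message.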