Let’s kickstart science in America


David Lang at Ideas.Ted: “Science funding is broken. To fix it, we need to empower a new class of makers, citizen scientists and explorers
The troubling state of science funding in America goes by many names: sequestration, the profzi scheme, the postdocalypse. Because securing research funds from the National Science Foundation and the National Institutes of Health can take years of planning in academia, “nobody takes risks anymore,” writes one researcher in his “Goodbye Academia” letter. “Nobody young jumps and tries totally new things, because it’s almost surely a noble way to suicide your career.” The result? We are on the verge of losing a generation of scientists at the exact moment we need to embolden them. Biologist Michael Eisen sums up the effects of the funding crunch: “It is an amazing time to do science, but an incredibly difficult time to be a scientist.”
It’s not all bad news for the thousands of science and conservation ideas that fall outside the traditional funding rubric. Fortunately, new citizen science models are emerging — along with a new class of philanthropic backers to fill the funding voids left by the NSF and the NIH. Our experience developing OpenROV (an open-source underwater robot) into one of the largest (by volume) underwater robot manufacturers in the world is illustrative of this shift.
Looking back at the sequence of events, it seems improbable that such a small amount of initial funding could have made such a large impact, but it makes perfect sense when you break down all the contributing factors. Two years ago, we weren’t even part of the oceanographic community. Our ideas and techniques were outside the playbook for experienced ocean engineers. And since we only had enough money to test the first thing, not the whole thing, we started by creating a prototype. Using TechShop equipment in San Francisco, we were able to create several iterations of a low-cost underwater robot that was suitable for our purpose: exploring an underwater cave and looking for lost treasure. After sharing our designs online, we found a community of like-minded developers. Together we raised over $100,000 on Kickstarter to do a first run of manufacturing.
This experience made us think: How can we make more microsponsorship opportunities available in science, exploration and conservation? OpenExplorer was our response. Instead of providing seed funding, we’ve created a model that gives everyone a chance to sponsor new ideas, research and expeditions in science and engineering. One success: TED Fellow Asha de Vos’s work on preventing the ship strike of blue whales in the Indian Ocean….”

Designing a Citizen Science and Crowdsourcing Toolkit for the Federal Government


Jenn Gustetic, Lea Shanley, Jay Benforado, and Arianne Miller at the White House Blog: “In the 2013 Second Open Government National Action Plan, President Obama called on Federal agencies to harness the ingenuity of the public by accelerating and scaling the use of open innovation methods, such as citizen science and crowdsourcing, to help address a wide range of scientific and societal problems.
Citizen science is a form of open collaboration in which members of the public participate in the scientific process, including identifying research questions, collecting and analyzing data, interpreting results, and solving problems. Crowdsourcing is a process in which individuals or organizations submit an open call for voluntary contributions from a large group of unknown individuals (“the crowd”) or, in some cases, a bounded group of trusted individuals or experts.
Citizen science and crowdsourcing are powerful tools that can help Federal agencies:

  • Advance and accelerate scientific research through group discovery and co-creation of knowledge. For instance, engaging the public in data collection can provide information at resolutions that would be difficult for Federal agencies to obtain due to time, geographic, or resource constraints.
  • Increase science literacy and provide students with skills needed to excel in science, technology, engineering, and math (STEM). Volunteers in citizen science or crowdsourcing projects gain hands-on experience doing real science, and take that learning outside of the classroom setting.
  • Improve delivery of government services with significantly lower resource investments.
  • Connect citizens to the missions of Federal agencies by promoting a spirit of open government and volunteerism.

To enable effective and appropriate use of these new approaches, the Open Government National Action Plan specifically commits the Federal government to “convene an interagency group to develop an Open Innovation Toolkit for Federal agencies that will include best practices, training, policies, and guidance on authorities related to open innovation, including approaches such as incentive prizes, crowdsourcing, and citizen science.”
On November 21, 2014, the Office of Science and Technology Policy (OSTP) kicked off development of the Toolkit with a human-centered design workshop. Human-centered design is a multi-stage process that requires product designers to engage with different stakeholders in creating, iteratively testing, and refining their product designs. The workshop was planned and executed in partnership with the Office of Personnel Management’s human-centered design practice known as “The Lab” and the Federal Community of Practice on Crowdsourcing and Citizen Science (FCPCCS), a growing network of more than 100 employees from more than 20 Federal agencies….
The Toolkit will help further the culture of innovation, learning, sharing, and doing in the Federal citizen science and crowdsourcing community: indeed, the development of the Toolkit is a collaborative and community-building activity in and of itself.
The following successful Federal projects illustrate the variety of possible citizen science and crowdsourcing applications:

  • The Citizen Archivist Dashboard (NARA) coordinates crowdsourced archival record tagging and document transcription. Recently, more than 170,000 volunteers indexed 132 million names of the 1940 Census in only five months, which NARA could not have done alone.
  • Through Measuring Broadband America (FCC), 2 million volunteers collected and provided the FCC with data on their Internet speeds, data that FCC used to create a National Broadband Map revealing digital divides.
  • In 2014, Nature’s Notebook (USGS, NSF) volunteers recorded more than 1 million observations on plants and animals that scientists use to analyze environmental change.
  • Did You Feel It? (USGS) has enabled more than 3 million people worldwide to share their experiences during and immediately after earthquakes. This information facilitates rapid damage assessments and scientific research, particularly in areas without dense sensor networks.
  • The mPING (NOAA) mobile app has collected more than 600,000 ground-based observations that help verify weather models.
  • USAID anonymized and opened its loan guarantee data to volunteer mappers. Volunteers mapped 10,000 data points in only 16 hours, compared to the 60 hours officials expected.
  • The Air Sensor Toolbox (EPA), together with training workshops, scientific partners, technology evaluations, and a scientific instrumentation loan program, empowers communities to monitor and report local air pollution.

In early 2015, OSTP, in partnership with the Challenges and Prizes Community of Practice, will convene Federal practitioners to develop the other half of the Open Innovation Toolkit for prizes and challenges. Stay tuned!”
 

NASA Launches New Citizen Science Website


 

Robert McNamara at Commons Lab: NASASolve debuted last month as a one-stop shop for prizes and challenges seeking contributions from people like you. Don’t worry: you need not be a rocket scientist to apply. The general public is encouraged to contribute to solving a variety of challenges facing NASA in reaching its mission goals. From hunting asteroids to redesigning the balance mass for the Mars lander, there are multitudes of ways for you to be a part of the nation’s space program.
Crowdsourcing the public for innovative solutions is something that NASA has been engaged in since 2005. But as NASA’s chief technologist points out, “NASASolve is a great way for members of the public and other citizen scientists to see all NASA prizes and challenges in one location.” The new site hopes to build on past successes like the Astronaut Glove Challenge, the ISS Longeron Challenge and the Zero Robotics Video Challenge. “Challenges are one tool to tap the top talent and best ideas. Partnering with the community to get ideas and solutions is important for NASA moving forward,” says Jennifer Gustetic, Program Executive of NASA Prizes and Challenges.
In order to encourage more active public participation, millions of dollars and scholarships have been set aside to reward those whose ideas and solutions succeed in taking on NASA’s challenges. If you want to get involved, visit NASASolve for more information and the current list of challenges waiting for solutions….

Citizen Science: The Law and Ethics of Public Access to Medical Big Data


New Paper by Sharona Hoffman: “Patient-related medical information is becoming increasingly available on the Internet, spurred by government open data policies and private sector data sharing initiatives. Websites such as HealthData.gov, GenBank, and PatientsLikeMe allow members of the public to access a wealth of health information. As the medical information terrain quickly changes, the legal system must not lag behind. This Article provides a base on which to build a coherent data policy. It canvasses emergent data troves and wrestles with their legal and ethical ramifications.
Publicly accessible medical data have the potential to yield numerous benefits, including scientific discoveries, cost savings, the development of patient support tools, healthcare quality improvement, greater government transparency, public education, and positive changes in healthcare policy. At the same time, the availability of electronic personal health information that can be mined by any Internet user raises concerns related to privacy, discrimination, erroneous research findings, and litigation. This Article analyzes the benefits and risks of health data sharing and proposes balanced legislative, regulatory, and policy modifications to guide data disclosure and use.”

Agency Liability Stemming from Citizen-Generated Data


Paper by Bailey Smith for The Wilson Center’s Science and Technology Innovation Program: “New ways to gather data are on the rise. One of these ways is through citizen science. According to a new paper by Bailey Smith, JD, federal agencies can feel confident about using citizen science for a few reasons. First, the legal system provides significant protection from liability through the Federal Torts Claim Act (FTCA) and Administrative Procedures Act (APA). Second, training and technological innovation has made it easier for the non-scientist to collect high quality data.”

Every citizen a scientist? An EU project tries to change the face of research


Project News from the European Commission:  “SOCIENTIZE builds on the concept of ‘Citizen Science’, which sees thousands of volunteers, teachers, researchers and developers put together their skills, time and resources to advance scientific research. Thanks to open source tools developed under the project, participants can help scientists collect data – which will then be analysed by professional researchers – or even perform tasks that require human cognition or intelligence like image classification or analysis.

Every citizen can be a scientist
The project helps usher in new advances in everything from astronomy to social science.
‘One breakthrough is our increased capacity to reproduce, analyse and understand complex issues thanks to the engagement of large groups of volunteers,’ says Mr Fermin Serrano Sanz, researcher at the University of Zaragoza and Project Coordinator of SOCIENTIZE. ‘And everyone can be a neuron in our digitally-enabled brain.’
But how can ordinary citizens help with such extraordinary science? The key, says Mr Serrano Sanz, is in harnessing the efforts of thousands of volunteers to collect and classify data. ‘We are already gathering huge amounts of user-generated data from the participants using their mobile phones and surrounding knowledge,’ he says.
For example, the experiment ‘SavingEnergy@Home’ asks users to submit data about the temperatures in their homes and neighbourhoods in order to build up a clearer picture of temperatures in cities across the EU, while in Spain, GripeNet.es asks citizens to report when they catch the flu in order to monitor outbreaks and predict possible epidemics.
Many Hands Make Light Work
But citizens can also help analyse data. Even the most advanced computers are not very good at recognising things like sun spots or cells, whereas people can tell the difference between living and dying cells very easily after only a short training.
The SOCIENTIZE projects ‘Sun4All’ and ‘Cell Spotting’ ask volunteers to label images of solar activity and cancer cells from an application on their phone or computer. With Cell Spotting, for instance, participants can observe cell cultures being studied with a microscope in order to determine their state and the effectiveness of medicines. Analysing this data would take years and cost hundreds of thousands of euros if left to a small team of scientists – but with thousands of volunteers helping the effort, researchers can make important breakthroughs quickly and more cheaply than ever before.
But in addition to bringing citizens closer to science, SOCIENTIZE also brings science closer to citizens. On 12-14 June, the project participated in the SONAR festival with ‘A Collective Music Experiment’ (CME). ‘Two hundred people joined professional DJs and created musical patterns using a web tool; participants shared their creations and re-used other parts in real time. The activity in the festival also included a live show of RdeRumba and Mercadal playing amateurs’ rhythms,’ Mr Serrano Sanz explains.
The experiment – which will be presented in a mini-documentary to raise awareness about citizen science – is expected to help understand other innovation processes observed in emergent social, technological, economic or political transformations. ‘This kind of event brings together a really diverse set of participants. The diversity does not only enrich the data; it improves the dialogue between professionals and volunteers. As a result, we see some new and innovative approaches to research.’
The EUR 0.7 million project brings together 6 partners from 4 countries: Spain (University of Zaragoza and TECNARA), Portugal (Museu da Ciência-Coimbra, MUSC; Universidade de Coimbra), Austria (Zentrum für Soziale Innovation) and Brazil (Universidade Federal de Campina Grande, UFCG).
SOCIENTIZE will end in October 2014 after bringing together 12,000 citizens in different phases of research activities over 24 months.”

Selected Readings on Crowdsourcing Tasks and Peer Production


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing was originally published in 2014.

Technological advances are creating a new paradigm by which institutions and organizations are increasingly outsourcing tasks to an open community, allocating specific needs to a flexible, willing and dispersed workforce. “Microtasking” platforms like Amazon’s Mechanical Turk are a burgeoning source of income for individuals who contribute their time, skills and knowledge on a per-task basis. In parallel, citizen science projects – task-based initiatives in which citizens of any background can help contribute to scientific research – like Galaxy Zoo are demonstrating the ability of lay and expert citizens alike to make small, useful contributions to aid large, complex undertakings. As governing institutions seek to do more with less, looking to the success of citizen science and microtasking initiatives could provide a blueprint for engaging citizens to help accomplish difficult, time-consuming objectives at little cost. Moreover, the incredible success of peer-production projects – best exemplified by Wikipedia – instills optimism regarding the public’s willingness and ability to complete relatively small tasks that feed into a greater whole and benefit the public good. You can learn more about this new wave of “collective intelligence” by following the MIT Center for Collective Intelligence and their annual Collective Intelligence Conference.

Annotated Selected Reading List (in alphabetical order)

Benkler, Yochai. The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale University Press, 2006. http://bit.ly/1aaU7Yb.

  • In this book, Benkler “describes how patterns of information, knowledge, and cultural production are changing – and shows that the way information and knowledge are made available can either limit or enlarge the ways people can create and express themselves.”
  • In his discussion on Wikipedia – one of many paradigmatic examples of people collaborating without financial reward – he calls attention to the notable ongoing cooperation taking place among a diversity of individuals. He argues that, “The important point is that Wikipedia requires not only mechanical cooperation among people, but a commitment to a particular style of writing and describing concepts that is far from intuitive or natural to people. It requires self-discipline. It enforces the behavior it requires primarily through appeal to the common enterprise that the participants are engaged in…”

Brabham, Daren C. Using Crowdsourcing in Government. Collaborating Across Boundaries Series. IBM Center for The Business of Government, 2013. http://bit.ly/17gzBTA.

  • In this report, Brabham categorizes government crowdsourcing cases into a “four-part, problem-based typology, encouraging government leaders and public administrators to consider these open problem-solving techniques as a way to engage the public and tackle difficult policy and administrative tasks more effectively and efficiently using online communities.”
  • The proposed four-part typology describes the following types of crowdsourcing in government:
    • Knowledge Discovery and Management
    • Distributed Human Intelligence Tasking
    • Broadcast Search
    • Peer-Vetted Creative Production
  • In his discussion on Distributed Human Intelligence Tasking, Brabham argues that Amazon’s Mechanical Turk and other microtasking platforms could be useful in a number of governance scenarios, including:
    • Governments and scholars transcribing historical document scans
    • Public health departments translating health campaign materials into foreign languages to benefit constituents who do not speak the native language
    • Governments translating tax documents, school enrollment and immunization brochures, and other important materials into minority languages
    • Helping governments predict citizens’ behavior, “such as for predicting their use of public transit or other services or for predicting behaviors that could inform public health practitioners and environmental policy makers”

Boudreau, Kevin J., Patrick Gaule, Karim Lakhani, Christoph Riedl, Anita Williams Woolley. “From Crowds to Collaborators: Initiating Effort & Catalyzing Interactions Among Online Creative Workers.” Harvard Business School Technology & Operations Mgt. Unit Working Paper No. 14-060. January 23, 2014. https://bit.ly/2QVmGUu.

  • In this working paper, the authors explore the “conditions necessary for eliciting effort from those affecting the quality of interdependent teamwork” and “consider the role of incentives versus social processes in catalyzing collaboration.”
  • The paper’s findings are based on an experiment involving 260 individuals randomly assigned to 52 teams working toward solutions to a complex problem.
  • The authors determined that the level of effort in such collaborative undertakings is sensitive to cash incentives. However, collaboration within teams was driven more by the active participation of teammates than by any monetary reward.

Franzoni, Chiara, and Henry Sauermann. “Crowd Science: The Organization of Scientific Research in Open Collaborative Projects.” Research Policy (August 14, 2013). http://bit.ly/HihFyj.

  • In this paper, the authors explore the concept of crowd science, which they define based on two important features: “participation in a project is open to a wide base of potential contributors, and intermediate inputs such as data or problem solving algorithms are made openly available.” The rationale for their study and conceptual framework is the “growing attention from the scientific community, but also policy makers, funding agencies and managers who seek to evaluate its potential benefits and challenges. Based on the experiences of early crowd science projects, the opportunities are considerable.”
  • Based on the study of a number of crowd science projects – including governance-related initiatives like Patients Like Me – the authors identify a number of potential benefits in the following categories:
    • Knowledge-related benefits
    • Benefits from open participation
    • Benefits from the open disclosure of intermediate inputs
    • Motivational benefits
  • The authors also identify a number of challenges:
    • Organizational challenges
    • Matching projects and people
    • Division of labor and integration of contributions
    • Project leadership
    • Motivational challenges
    • Sustaining contributor involvement
    • Supporting a broader set of motivations
    • Reconciling conflicting motivations

Kittur, Aniket, Ed H. Chi, and Bongwon Suh. “Crowdsourcing User Studies with Mechanical Turk.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 453–456. CHI ’08. New York, NY, USA: ACM, 2008. http://bit.ly/1a3Op48.

  • In this paper, the authors examine “[m]icro-task markets, such as Amazon’s Mechanical Turk, [which] offer a potential paradigm for engaging a large number of users for low time and monetary costs. [They] investigate the utility of a micro-task market for collecting user measurements, and discuss design considerations for developing remote micro user evaluation tasks.”
  • The authors conclude that in addition to providing a means for crowdsourcing small, clearly defined, often non-skill-intensive tasks, “Micro-task markets such as Amazon’s Mechanical Turk are promising platforms for conducting a variety of user study tasks, ranging from surveys to rapid prototyping to quantitative measures. Hundreds of users can be recruited for highly interactive tasks for marginal costs within a timeframe of days or even minutes. However, special care must be taken in the design of the task, especially for user measurements that are subjective or qualitative.”

Kittur, Aniket, Jeffrey V. Nickerson, Michael S. Bernstein, Elizabeth M. Gerber, Aaron Shaw, John Zimmerman, Matthew Lease, and John J. Horton. “The Future of Crowd Work.” In 16th ACM Conference on Computer Supported Cooperative Work (CSCW 2013), 2012. http://bit.ly/1c1GJD3.

  • In this paper, the authors discuss paid crowd work, which “offers remarkable opportunities for improving productivity, social mobility, and the global economy by engaging a geographically distributed workforce to complete complex tasks on demand and at scale.” However, they caution that, “it is also possible that crowd work will fail to achieve its potential, focusing on assembly-line piecework.”
  • The authors argue that seven key challenges must be met to ensure that crowd work processes evolve and reach their full potential:
    • Designing workflows
    • Assigning tasks
    • Supporting hierarchical structure
    • Enabling real-time crowd work
    • Supporting synchronous collaboration
    • Controlling quality

Madison, Michael J. “Commons at the Intersection of Peer Production, Citizen Science, and Big Data: Galaxy Zoo.” In Convening Cultural Commons, 2013. http://bit.ly/1ih9Xzm.

  • This paper explores a “case of commons governance grounded in research in modern astronomy. The case, Galaxy Zoo, is a leading example of at least three different contemporary phenomena. In the first place, Galaxy Zoo is a global citizen science project, in which volunteer non-scientists have been recruited to participate in large-scale data analysis on the Internet. In the second place, Galaxy Zoo is a highly successful example of peer production, sometimes known as crowdsourcing…In the third place, Galaxy Zoo is a highly visible example of data-intensive science, sometimes referred to as e-science or Big Data science, by which scientific researchers develop methods to grapple with the massive volumes of digital data now available to them via modern sensing and imaging technologies.”
  • Madison concludes that the success of Galaxy Zoo has not been the result of the “character of its information resources (scientific data) and rules regarding their usage,” but rather, the fact that the “community was guided from the outset by a vision of a specific organizational solution to a specific research problem in astronomy, initiated and governed, over time, by professional astronomers in collaboration with their expanding universe of volunteers.”

Malone, Thomas W., Robert Laubacher and Chrysanthos Dellarocas. “Harnessing Crowds: Mapping the Genome of Collective Intelligence.” MIT Sloan Research Paper. February 3, 2009. https://bit.ly/2SPjxTP.

  • In this article, the authors describe and map the phenomenon of collective intelligence – also referred to as “radical decentralization, crowd-sourcing, wisdom of crowds, peer production, and wikinomics” – which they broadly define as “groups of individuals doing things collectively that seem intelligent.”
  • The article is derived from the authors’ work at MIT’s Center for Collective Intelligence, where they gathered nearly 250 examples of Web-enabled collective intelligence. To map the building blocks or “genes” of collective intelligence, the authors used two pairs of related questions:
    • Who is performing the task? Why are they doing it?
    • What is being accomplished? How is it being done?
  • The authors concede that much work remains to be done “to identify all the different genes for collective intelligence, the conditions under which these genes are useful, and the constraints governing how they can be combined,” but they believe that their framework provides a useful start and gives managers and other institutional decisionmakers looking to take advantage of collective intelligence activities the ability to “systematically consider many possible combinations of answers to questions about Who, Why, What, and How.”
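
Malone, Laubacher and Dellarocas’s Who/Why/What/How framework lends itself to a simple data model. The sketch below is illustrative only: the class name, allele sets, and the Wikipedia mapping are our own choices, though the allele values themselves (Crowd/Hierarchy, Money/Love/Glory, Create/Decide, Collection/Collaboration/Group decision/Individual decision) come from the paper’s description of collective-intelligence “genes”:

```python
from dataclasses import dataclass

# Allele values drawn from Malone et al.'s framework: each gene
# dimension takes one value from a small, fixed set.
WHO = {"crowd", "hierarchy"}
WHY = {"money", "love", "glory"}
WHAT = {"create", "decide"}
HOW = {"collection", "collaboration", "group_decision", "individual_decision"}

@dataclass(frozen=True)
class Gene:
    """One building block ('gene') of a collective-intelligence system."""
    who: str
    why: str
    what: str
    how: str

    def __post_init__(self):
        # Reject descriptions that fall outside the framework's vocabulary.
        for value, allowed in ((self.who, WHO), (self.why, WHY),
                               (self.what, WHAT), (self.how, HOW)):
            if value not in allowed:
                raise ValueError(f"unknown allele: {value!r}")

# A system's "genome" is the set of genes describing its activities.
# Wikipedia article writing, mapped roughly onto the framework:
wikipedia_editing = Gene(who="crowd", why="love",
                         what="create", how="collaboration")
```

Enumerating combinations of these alleles is exactly the “systematically consider many possible combinations” exercise the authors propose for managers.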

Mulgan, Geoff. “True Collective Intelligence? A Sketch of a Possible New Field.” Philosophy & Technology 27, no. 1. March 2014. http://bit.ly/1p3YSdd.

  • In this paper, Mulgan explores the concept of collective intelligence, a “much talked about but…very underdeveloped” field.
  • With a particular focus on health knowledge, Mulgan “sets out some of the potential theoretical building blocks, suggests an experimental and research agenda, shows how it could be analysed within an organisation or business sector and points to possible intellectual barriers to progress.”
  • He concludes that the “central message that comes from observing real intelligence is that intelligence has to be for something,” and that “turning this simple insight – the stuff of so many science fiction stories – into new theories, new technologies and new applications looks set to be one of the most exciting prospects of the next few years and may help give shape to a new discipline that helps us to be collectively intelligent about our own collective intelligence.”

Sauermann, Henry and Chiara Franzoni. “Participation Dynamics in Crowd-Based Knowledge Production: The Scope and Sustainability of Interest-Based Motivation.” SSRN Working Papers Series. November 28, 2013. http://bit.ly/1o6YB7f.

  • In this paper, Sauermann and Franzoni explore the issue of interest-based motivation in crowd-based knowledge production – in particular the use of the crowd science platform Zooniverse – by drawing on “research in psychology to discuss important static and dynamic features of interest and deriv[ing] a number of research questions.”
  • The authors find that interest-based motivation is often tied to a “particular object (e.g., task, project, topic)” rather than to a “general trait of the person or a general characteristic of the object.” As such, they find that “most members of the installed base of users on the platform do not sign up for multiple projects, and most of those who try out a project do not return.”
  • They conclude that “interest can be a powerful motivator of individuals’ contributions to crowd-based knowledge production…However, both the scope and sustainability of this interest appear to be rather limited for the large majority of contributors…At the same time, some individuals show a strong and more enduring interest to participate both within and across projects, and these contributors are ultimately responsible for much of what crowd science projects are able to accomplish.”

Schmitt-Sands, Catherine E. and Richard J. Smith. “Prospects for Online Crowdsourcing of Social Science Research Tasks: A Case Study Using Amazon Mechanical Turk.” SSRN Working Papers Series. January 9, 2014. http://bit.ly/1ugaYja.

  • In this paper, the authors describe an experiment involving the nascent use of Amazon’s Mechanical Turk as a social science research tool. “While researchers have used crowdsourcing to find research subjects or classify texts, [they] used Mechanical Turk to conduct a policy scan of local government websites.”
  • Schmitt-Sands and Smith found that “crowdsourcing worked well for conducting an online policy scan.” The microtasked workers were helpful in screening out local governments that either did not have websites or did not have the types of policies and services for which the researchers were looking. However, “if the task is complicated such that it requires ongoing supervision, then crowdsourcing is not the best solution.”

Shirky, Clay. Here Comes Everybody: The Power of Organizing Without Organizations. New York: Penguin Press, 2008. https://bit.ly/2QysNif.

  • In this book, Shirky explores our current era in which, “For the first time in history, the tools for cooperating on a global scale are not solely in the hands of governments or institutions. The spread of the Internet and mobile phones are changing how people come together and get things done.”
  • Discussing Wikipedia’s “spontaneous division of labor,” Shirky argues that “the process is more like creating a coral reef, the sum of millions of individual actions, than creating a car. And the key to creating those individual actions is to hand as much freedom as possible to the average user.”

Silvertown, Jonathan. “A New Dawn for Citizen Science.” Trends in Ecology & Evolution 24, no. 9 (September 2009): 467–471. http://bit.ly/1iha6CR.

  • This article discusses the move from “Science for the people,” a slogan adopted by activists in the 1970s, to “Science by the people,” which is “a more inclusive aim, and is becoming a distinctly 21st century phenomenon.”
  • Silvertown identifies three factors that are responsible for the explosion of activity in citizen science, each of which could be similarly related to the crowdsourcing of skills by governing institutions:
    • “First is the existence of easily available technical tools for disseminating information about products and gathering data from the public.
    • A second factor driving the growth of citizen science is the increasing realisation among professional scientists that the public represent a free source of labour, skills, computational power and even finance.
    • Third, citizen science is likely to benefit from the condition that research funders such as the National Science Foundation in the USA and the Natural Environment Research Council in the UK now impose upon every grantholder to undertake project-related science outreach. This is outreach as a form of public accountability.”

Szkuta, Katarzyna, Roberto Pizzicannella, and David Osimo. “Collaborative approaches to public sector innovation: A scoping study.” Telecommunications Policy, 2014. http://bit.ly/1oBg9GY.

  • In this article, the authors explore cases where government collaboratively delivers online public services, with a focus on success factors and “incentives for services providers, citizens as users and public administration.”
  • The authors focus on five types of collaborative governance projects:
    • Services initiated by government built on government data;
    • Services initiated by government and making use of citizens’ data;
    • Services initiated by civil society built on open government data;
    • Collaborative e-government services; and
    • Services run by civil society and based on citizen data.
  • The cases explored “are all designed in the way that effectively harnesses the citizens’ potential. Services susceptible to collaboration are those that require computing efforts, i.e. many non-complicated tasks (e.g. citizen science projects – Zooniverse) or citizens’ free time in general (e.g. time banks). Those services also profit from unique citizens’ skills and their propensity to share their competencies.”

CollaborativeScience.org: Sustaining Ecological Communities Through Citizen Science and Online Collaboration


David Mellor at CommonsLab: “In any endeavor, there can be a tradeoff between intimacy and impact. The same is true for science in general and citizen science in particular. Large projects with thousands of collaborators can have incredible impact and robust, global implications. On the other hand, locally based projects can foster close-knit ties that encourage collaboration and learning, but face an uphill battle when it comes to creating rigorous and broadly relevant investigations. Online collaboration has the potential to harness the strengths of both of these strategies if a space can be created that allows for the easy sharing of complex ideas and conservation strategies.
CollaborativeScience.org was created by researchers from five different universities to train Master Naturalists in ecology, scientific modeling and adaptive management, and then give these capable volunteers a space to put their training to work and create conservation plans in collaboration with researchers and land managers.
We are focusing on scientific modeling throughout this process because environmental managers and ecologists have been trained to intuitively create explanations based on a very large number of related observations. As new data are collected, these explanations are revised and are put to use in generating new, testable hypotheses. The modeling tools that we are providing to our volunteers allow them to formalize this scientific reasoning by adding information, sources and connections, then making predictions based on possible changes to the system. We integrate their projects into the well-established citizen science tools at CitSci.org and guide them through the creation of an adaptive management plan, a proven conservation project framework…”

The Weird, Wild World of Citizen Science Is Already Here


David Lang in Wired: “Up and down the west coast of North America, countless numbers of starfish are dying. The affliction, known as Sea Star Wasting Syndrome, is already being called the biggest die-off of sea stars in recorded history, and we’re still in the dark as to what’s causing it or what it means. It remains an unsolved scientific mystery. The situation is also shaping up as a case study of an unsung scientific opportunity: the rise of citizen science and exploration.
The sea star condition was first noticed by Laura James, a diver and underwater videographer based in Seattle. As sea stars began washing up on the shore near her home with lesions and missing limbs, she became concerned and notified scientists. Similar sightings started cropping up all along the West Coast, with gruesome descriptions of sea stars that were disintegrating in a matter of days, and populations that had been decimated. As scientists race to understand what’s happening, they’ve enlisted the help of amateurs like James to move faster. Pete Raimondi’s lab at UC Santa Cruz has created the Sea Star Wasting Map, the baseline for monitoring the issue, to capture the diverse set of contributors and collaborators.
The map is one of many new models of citizen-powered science–a blend of amateurs and professionals, looking and learning together–that are beginning to emerge. Just this week, NASA endorsed a group of amateur astronomers to attempt to rescue a vintage U.S. spacecraft. NASA doesn’t have the money to do it, and this passionate group of citizen scientists can handle it.
Unfortunately, the term “citizen science” is terrible. It’s vague enough to be confusing, yet specific enough to seem exclusive. It’s too bad, too, because the idea of citizen science is thrilling. I love the notion that I can participate in the expanding pool of human knowledge and understanding, even though the extent of my formal science education is a high school biology class. To me, it seemed a genuine invitation to be curious. A safe haven for beginners. A license to explore.
Not everyone shares my romantic perspective, though. If you ask a university researcher, they’re likely to explain citizen science as a way for the public to contribute data points to larger, professionally run studies, like participating in the galaxy-spotting website Zooniverse or taking part in the annual Christmas Bird Count with the Audubon Society. It’s a model on the scientific fringes, using broad participation to fill the gaps in necessary data.
There’s power in this diffuse definition, though, as long as new interpretations are welcomed and encouraged. By inviting and inspiring people to ask their own questions, citizen science can become much more than a way of measuring bird populations. From the drone-wielding conservationists in South Africa to the makeshift biolabs in Brooklyn, a widening circle of participants are wearing the amateur badge with honor. And all of these groups–the makers, the scientists, the hobbyists–are converging to create a new model for discovery. In other words, the maker movement and the traditional science world are on a collision course.
To understand the intersection, it helps to know where each of those groups is coming from….”

Paying Farmers to Welcome Birds


Jim Robbins in The New York Times: “The Central Valley was once one of North America’s most productive wildlife habitats, a 450-mile-long expanse marbled with meandering streams and lush wetlands that provided an ideal stop for migratory shorebirds on their annual journeys from South America and Mexico to the Arctic and back.

Farmers and engineers have long since tamed the valley. Of the wetlands that existed before the valley was settled, about 95 percent are gone, and the number of migratory birds has declined drastically. But now an unusual alliance of conservationists, bird watchers and farmers has joined in an innovative plan to restore essential habitat for the migrating birds.

The program, called BirdReturns, starts with data from eBird, the pioneering citizen science project that asks birders to record sightings on a smartphone app and send the information to the Cornell Lab of Ornithology in upstate New York.

By crunching data from the Central Valley, eBird can generate maps showing where virtually every species congregates in the remaining wetlands. Then, by overlaying those maps on aerial views of existing surface water, it can determine where the birds’ need for habitat is greatest….

BirdReturns is an example of the growing movement called reconciliation ecology, in which ecosystems dominated by humans are managed to increase biodiversity.

“It’s a new ‘Moneyball,’ ” said Eric Hallstein, an economist with the Nature Conservancy and a designer of the auctions, referring to the book and movie about the Oakland Athletics’ data-driven approach to baseball. “We’re disrupting the conservation industry by taking a new kind of data, crunching it differently and contracting differently.”