Crowdsourcing and Humanitarian Action: Analysis of the Literature


Patrick Meier:  “Raphael Hörler from Zurich’s ETH University has just completed his thesis on the role of crowdsourcing in humanitarian action. His valuable research offers one of the most up-to-date and comprehensive reviews of the principal players and humanitarian technologies in action today. In short, I highly recommend this important resource. Raphael’s full thesis is available here (PDF).”

Look to Government—Yes, Government—for New Social Innovations


Paper by Christian Bason and Philip Colligan: “If asked to identify the hotbed of social innovation right now, many people would likely point to the new philanthropy of Silicon Valley or the social entrepreneurship efforts supported by Ashoka, Echoing Green, and Skoll Foundation. Very few people, if any, would mention their state capital or Capitol Hill. While local and national governments may have promulgated some of the greatest advances in human history — from public education to putting a man on the moon — public bureaucracies are more commonly known to stifle innovation.
Yet, around the world, there are local, regional, and national government innovators who are challenging this paradigm. They are pioneering a new form of experimental government — bringing new knowledge and practices to the craft of governing and policy making; drawing on human-centered design, user engagement, open innovation, and cross-sector collaboration; and using data, evidence, and insights in new ways.
Earlier this year, Nesta, the UK’s innovation foundation (which Philip helps run), teamed up with Bloomberg Philanthropies to publish i-teams, the first global review of public innovation teams set up by national and city governments. The study profiled 20 of the most established i-teams from around the world, including:

  • French Experimental Fund for Youth, which has supported more than 554 experimental projects (such as one that reduces school drop-out rates) that have benefited over 480,000 young people;
  • Nesta’s Innovation Lab, which has run 70 open innovation challenges and programs supporting over 750 innovators working in fields as diverse as energy efficiency, healthcare, and digital education;
  • New Orleans’ Innovation and Delivery team, which achieved a 19% reduction in the number of murders in the city in 2013 compared to the previous year.

How are i-teams achieving these results? The most effective ones are explicit about the goal they seek – be it creating a solution to a specific policy challenge, engaging citizenry in behaviors that help the commonweal, or transforming the way government behaves. Importantly, these teams are also able to deploy the right skills, capabilities, and methods for the job.
In addition, i-teams have a strong bias toward action. They apply academic research in behavioral economics and psychology to public policy and services, focusing on rapid experimentation and iteration. The approach stands in stark contrast to the normal routines of government.
Take, for example, the UK’s Behavioural Insights Team (BIT), often called the Nudge Unit. It sets clear goals, engages the right expertise to prototype means to the end, and tests innovations rapidly in the field to learn what’s not working and rapidly scale what is.
One of BIT’s most famous projects changed taxpayer behavior. BIT’s team of economists, behavioral psychologists, and seasoned government staffers came up with minor changes to tax letters, sent out by the UK Government, that subtly introduced positive peer pressure. By simply altering the letters to say that most people in their local area had already paid their taxes, BIT was able to boost repayment rates by around 5%. This trial was part of a range of interventions, which have helped forward over £200 million in additional tax revenue to HM Revenue & Customs, the UK’s tax authority.
The Danish government’s internal i-team, MindLab (which Christian ran for 8 years) has likewise influenced citizen behavior….”
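As a rough illustration of how a two-arm letter trial like the one described above is typically analyzed, the sketch below compares payment rates between a control letter and a social-norm letter with a two-proportion z-test. The counts are invented and are not BIT’s data; this is not BIT’s analysis code.

```python
# Hypothetical sketch of how a two-arm letter trial might be read: compare
# payment rates between a standard letter and a social-norm letter.
# All counts are invented for illustration; they are not BIT's data.
from math import sqrt

control_paid, control_n = 6_700, 10_000   # standard reminder letter
treated_paid, treated_n = 7_200, 10_000   # "most people near you have already paid"

p_control = control_paid / control_n
p_treated = treated_paid / treated_n
lift = p_treated - p_control              # absolute difference in payment rates

# Two-proportion z-test under the pooled null hypothesis of no difference.
p_pooled = (control_paid + treated_paid) / (control_n + treated_n)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / control_n + 1 / treated_n))
z = lift / se

print(f"payment rate: control {p_control:.1%}, treatment {p_treated:.1%}")
print(f"lift: {lift:.1%} (z = {z:.1f})")
```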

We’re All Pirates Now


Book Review by Edward Kosner of “Information Doesn’t Want to Be Free” in the Wall Street Journal: “Do you feel like a thief when you click on a website link and find yourself reading an article or listening to a song you haven’t paid for? Should you? Are you annoyed when you can’t copy a movie you’ve paid for onto your computer’s hard drive? Should you be? Should copyright, conceived in England three centuries ago to protect writers from unscrupulous printers, apply the same way to creators and consumers in the digital age?
The sci-fi writer, blogger and general man-about-the-Web Cory Doctorow tries to answer some of these questions—and introduces others—in “Information Doesn’t Want to Be Free.” Billed as a guide for perplexed creators about how to make a living in the Internet Era, the book is actually a populist manifesto for the information revolution.
Mr. Doctorow is a confident and aphoristic writer—his book is like one long TED talk—and his basic advice to creators is easy to grasp: Aspiring novelists, journalists, musicians and other artists and would-be artists should recognize the Web as an unprecedented promotional medium rather than a revenue source. Creators, writes Mr. Doctorow, need to get known before they can expect to profit from their work. So they should welcome having their words, music or images reproduced online without permission to pave the way for a later payoff.
Even if they manage to make a name, he warns, they’re likely to be ripped off by the entertainment-industrial complex—big book publishers, record companies, movie studios, Google, Apple and Microsoft. But they can monetize their creativity by, among other things, selling tickets to public shows, peddling “swag”—T-shirts, ball caps, posters and recordings—and taking commissions for new work.

He cites the example of a painter named Molly Crabapple, who, inspired by the Occupy Wall Street movement, raised $55,000 on the crowdsourcing site Kickstarter, rented a storefront and created nine huge canvases, seven of which she sold for $8,000 each. Not the easiest way to become the next Jeff Koons, Taylor Swift or Gillian Flynn.
But Mr. Doctorow turns out to be less interested in mentoring unrealized talent than in promulgating a new regime for copyright regulation on the Internet. Copyright has been enshrined in American law since 1790, but computer technology, he argues, has rendered the concept obsolete: “We can’t stop copying on the Internet because the Internet is a copying machine.” And the whole debate, he complains, “is filled with lies, damn lies and piracy statistics.”
There’s lots of technical stuff here about digital locks—he calls devices like the Kindle “roach motels” that allow content to be loaded but never offloaded elsewhere—as well as algorithms, embedded keys and such. And the book is clotted with acronyms: A diligent reader who finishes this slim volume should be able to pass a test on the meaning of ACTA, WIPO, WPPT, WCT, DMCA, DNS, SOPA and PIPA, not to mention NaTD (techspeak for “Notice and Take Down”).
The gist of Mr. Doctorow’s argument is that the bad guys of the content game use copyright protection and antipiracy protocols not to help creators but to enrich themselves at the expense of the talent and the consumers of content. Similarly, he contends that the crusade against “net neutrality”—the principle that Internet carriers must treat all data and users the same way—is actually a ploy to elevate big players in the digital world by turning the rest of us into second-class Netizens.
“The future of the Internet,” he writes, “should not be a fight about whether Google (or Apple or Microsoft) gets to be in charge or whether Hollywood gets to be in charge. Left to their own devices, Big Tech and Big Content are perfectly capable of coming up with a position that keeps both ‘sides’ happy at the expense of everyone else.”…”

Measuring the Impact of Public Innovation in the Wild


Beth Noveck at Governing: “With complex, seemingly intractable problems such as inequality, climate change and affordable access to health care plaguing contemporary society, traditional institutions such as government agencies and nonprofit organizations often lack strategies for tackling them effectively and legitimately. For this reason, this year the MacArthur Foundation launched its Research Network on Opening Governance.
The Network, which I chair and which also is supported by Google.org, is what MacArthur calls a “research institution without walls.” It brings together a dozen researchers across universities and disciplines, with an advisory network of academics, technologists, and current and former government officials, to study new ways of addressing public problems using advances in science and technology.
Through regular meetings and collaborative projects, the Network is exploring, for example, the latest techniques for more open and transparent decision-making, the uses of data to transform how we govern, and the identification of an individual’s skills and experiences to improve collaborative problem-solving between government and citizen.
One of the central questions we are grappling with is how to accelerate the pace of research so we can learn better and faster when an innovation in governance works — for whom, in which contexts and under which conditions. With better methods for doing fast-cycle research in collaboration with government — in the wild, not in the lab — our hope is to be able to predict with accuracy, not just know after the fact, whether innovations such as opening up an agency’s data or consulting with citizens using a crowdsourcing platform are likely to result in real improvements in people’s lives.
An example of such an experiment is the work that members of the Network are undertaking with the Food and Drug Administration. As one of its duties, the FDA manages the process of pre-market approval of medical devices to ensure that patients and providers have timely access to safe, effective and high-quality technology, as well as the post-market review of medical devices to ensure that unsafe ones are identified and recalled from the market. In both of these contexts, the FDA seeks to provide the medical-device industry with productive, consistent, transparent and efficient regulatory pathways.
With thousands of devices, many of them employing cutting-edge technology, to examine each year, the FDA is faced with the challenge of finding the right internal and external expertise to help it quickly study a device’s safety and efficacy. Done right, lives can be saved and companies can prosper from bringing innovations quickly to market. Done wrong, bad devices can kill…”

Code of Conduct: Cyber Crowdsourcing for Good


Patrick Meier at iRevolution: “There is currently no unified code of conduct for digital crowdsourcing efforts in the development, humanitarian or human rights space. As such, we propose the following principles (displayed below) as a way to catalyze a conversation on these issues and to improve and/or expand this Code of Conduct as appropriate.
This initial draft was put together by Kate Chapman, Brooke Simons and myself. The link above points to this open, editable Google Doc. So please feel free to contribute your thoughts by inserting comments where appropriate. Thank you.
An organization that launches a digital crowdsourcing project must:

  • Provide clear volunteer guidelines on how to participate in the project so that volunteers are able to contribute meaningfully.
  • Test their crowdsourcing platform prior to any project or pilot to ensure that the system will not crash due to obvious bugs.
  • Disclose the purpose of the project, exactly which entities will be using and/or have access to the resulting data, to what end exactly, over what period of time and what the expected impact of the project is likely to be.
  • Disclose whether volunteer contributions to the project will or may be used as training data in subsequent machine learning research.
  • ….

An organization that launches a digital crowdsourcing project should:

  • Share as much of the resulting data with volunteers as possible without violating data privacy or the principle of Do No Harm.
  • Enable volunteers to opt out of having their tasks contribute to subsequent machine learning research, i.e., provide digital volunteers with the option of having their contributions withheld from subsequent machine learning studies.
  • … “

Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems


New paper by Charles D Borromeo, Titus K Schleyer, Michael J Becich, and Harry Hochheiser: “Background: Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs.
Objective: The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype.
Methods: Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS).
Results: Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified.
Conclusions: Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the implications of collaborator search tools for researcher workflows.”
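As a rough sketch of two requirements the study identifies, conjunctive keyword search and a chronological view of publications and funding, the toy example below filters researcher records that match all required keywords and prints them as a timeline. It is not the authors’ D3/VIVO prototype; the record fields and data are invented.

```python
# Toy sketch of two requirements from the study (not the authors' D3/VIVO
# prototype): conjunctive (AND) keyword search over researcher records and
# a chronological listing of publications and grants. Data are invented.
from dataclasses import dataclass

@dataclass
class Record:
    researcher: str
    year: int
    kind: str          # "publication" or "grant"
    title: str
    keywords: set

RECORDS = [
    Record("A. Smith", 2011, "grant", "Clinical research informatics core",
           {"informatics", "clinical"}),
    Record("A. Smith", 2013, "publication", "Mining EHR data for risk models",
           {"informatics", "clinical", "ehr"}),
    Record("B. Jones", 2012, "publication", "Visualizing co-author networks",
           {"visualization", "networks"}),
]

def conjunctive_search(records, required):
    """Return only records whose keywords contain *all* required terms."""
    required = set(required)
    return [r for r in records if required <= r.keywords]

# Chronological display: oldest first, so a candidate collaborator's funding
# and publication history reads as a timeline.
for r in sorted(conjunctive_search(RECORDS, {"informatics", "clinical"}),
                key=lambda r: r.year):
    print(f"{r.year}  {r.researcher:<10} [{r.kind}] {r.title}")
```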

The New Thing in Google Flu Trends Is Traditional Data


In the New York Times: “Google is giving its Flu Trends service an overhaul — “a brand new engine,” as it announced in a blog post on Friday.

The new thing is actually traditional data from the Centers for Disease Control and Prevention that is being integrated into the Google flu-tracking model. The goal is greater accuracy after the Google service had been criticized for consistently over-estimating flu outbreaks in recent years.

The main critique came in an analysis done by four quantitative social scientists, published earlier this year in an article in Science magazine, “The Parable of Google Flu: Traps in Big Data Analysis.” The researchers found that the most accurate flu predictor was a data mash-up that combined Google Flu Trends, which monitored flu-related search terms, with the official C.D.C. reports from doctors on influenza-like illness.

The Google Flu Trends team is heeding that advice. In the blog post, Christian Stefansen, a Google senior software engineer, wrote, “We’re launching a new Flu Trends model in the United States that — like many of the best performing methods in the literature — takes official CDC flu data into account as the flu season progresses.”

Google’s flu-tracking service has had its ups and downs. Its triumph came in 2009, when it gave an advance signal of the severity of the H1N1 outbreak, two weeks or so ahead of official statistics. In a 2009 article in Nature explaining how Google Flu Trends worked, the company’s researchers did, as the Friday post notes, say that the Google service was not intended to replace official flu surveillance methods and that it was susceptible to “false alerts” — anything that might prompt a surge in flu-related search queries.

Yet those caveats came a couple of pages into the Nature article. And Google Flu Trends became a symbol of the superiority of the new, big data approach — computer algorithms mining data trails for collective intelligence in real time. To enthusiasts, it seemed so superior to the antiquated method of collecting health data that involved doctors talking to patients, inspecting them and filing reports.

But Google’s flu service greatly overestimated the number of cases in the United States in the 2012-13 flu season — a well-known miss — and, according to the research published this year, has persistently overstated flu cases over the years. In the Science article, the social scientists called it “big data hubris.”
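The “data mash-up” the Science authors recommend, blending a real-time search signal with lagged official CDC surveillance, can be illustrated with a toy regression. The sketch below uses entirely synthetic data; it is not Google’s or the CDC’s model, and the variable names and numbers are invented for illustration.

```python
# Toy illustration of the "data mash-up" idea: nowcast flu activity from a
# noisy real-time search signal plus lagged official CDC ILI data.
# Everything here is synthetic; this is not Google's or the CDC's model.
import numpy as np

rng = np.random.default_rng(0)

weeks = 104
true_ili = 2.0 + 1.5 * np.sin(2 * np.pi * np.arange(weeks) / 52)   # "%ILI"
search_index = true_ili + rng.normal(0, 0.6, weeks)   # noisy search-term proxy
cdc_lagged = np.roll(true_ili, 2)                     # official data, 2-week lag

# Fit ili_t ~ b0 + b1 * search_t + b2 * cdc_(t-2) on the first year.
# (np.roll wraps around, so skip the first two weeks when fitting.)
X = np.column_stack([np.ones(weeks), search_index, cdc_lagged])
beta, *_ = np.linalg.lstsq(X[2:52], true_ili[2:52], rcond=None)

# Nowcast the second year and compare against a search-only estimate.
combined_err = np.abs(X[52:] @ beta - true_ili[52:]).mean()
search_only_err = np.abs(search_index[52:] - true_ili[52:]).mean()
print(f"mean abs error, search only:         {search_only_err:.2f}")
print(f"mean abs error, search + lagged CDC: {combined_err:.2f}")
```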

NASA Launches New Citizen Science Website


 

Robert McNamara at Commons Lab: “NASASolve debuted last month as a one-stop-shop for prizes and challenges that are seeking contributions from people like you. Don’t worry, you need not be a rocket scientist to apply. The general public is encouraged to contribute to solving a variety of challenges facing NASA in reaching its mission goals. From hunting asteroids to re-designing the Balance Mass for the Mars Lander, there are multitudes of ways for you to be a part of the nation’s space program.
Crowdsourcing the public for innovative solutions is something that NASA has been engaged in since 2005.  But as NASA’s chief technologist points out, “NASASolve is a great way for members of the public and other citizen scientists to see all NASA prizes and challenges in one location.”  The new site hopes to build on past successes like the Astronaut Glove Challenge, the ISS Longeron Challenge and the Zero Robotics Video Challenge. “Challenges are one tool to tap the top talent and best ideas. Partnering with the community to get ideas and solutions is important for NASA moving forward,” says Jennifer Gustetic, Program Executive of NASA Prizes and Challenges.
In order to encourage more active public participation, millions of dollars and scholarships have been set aside to reward those whose ideas and solutions succeed in taking on NASA’s challenges. If you want to get involved, visit NASASolve for more information and the current list of challenges waiting for solutions….”

ShareHub: At the Heart of Seoul’s Sharing Movement


Cat Johnson at Shareable: “In 2012, Seoul publicly announced its commitment to becoming a sharing city. It has since emerged as a leader of the global sharing movement and serves as a model for cities around the world. Supported by the municipal government and embedded in numerous parts of everyday life in Seoul, the Sharing City project has proven to be an inspiration to city leaders, entrepreneurs, and sharing enthusiasts around the world.
At the heart of Sharing City, Seoul is ShareHub, an online platform that connects users with sharing services, educates and informs the public about sharing initiatives, and serves as the online hub for the Sharing City, Seoul project. Now a year and a half into its existence, ShareHub, which is powered by Creative Commons Korea (CC Korea), has served 1.4 million visitors since launching, hosts more than 350 articles about sharing, and has played a key role in promoting sharing policies and projects. Shareable connected with Nanshil Kwon, manager of ShareHub, to find out more about the project, its role in promoting sharing culture, and the future of the sharing movement in Seoul….”

When Experts Are a Waste of Money


Vivek Wadhwa at the Wall Street Journal: “Corporations have always relied on industry analysts, management consultants and in-house gurus for advice on strategy and competitiveness. Since these experts understand the products, markets and industry trends, they also get paid the big bucks.
But what experts do is analyze historical trends, extrapolate forward on a linear basis and protect the status quo — their field of expertise. And technologies are not progressing linearly anymore; they are advancing exponentially. Technology is advancing so rapidly that listening to people who just have domain knowledge and vested interests will put a company on the fastest path to failure. Experts are no longer the right people to turn to; they are a waste of money.
Just as the processing power of our computers doubles every 18 months, with prices falling and devices becoming smaller, fields such as medicine, robotics, artificial intelligence and synthetic biology are seeing accelerated change. Competition now comes from the places you least expect it to. The health-care industry, for example, is about to be disrupted by advances in sensors and artificial intelligence; lodging and transportation, by mobile apps; communications, by Wi-Fi and the Internet; and manufacturing, by robotics and 3-D printing.
To see the competition coming and develop strategies for survival, companies now need armies of people, not experts. The best knowledge comes from employees, customers and outside observers who aren’t constrained by their expertise or personal agendas. It is they who can best identify the new opportunities. The collective insight of large numbers of individuals is superior because of the diversity of ideas and breadth of knowledge that they bring. Companies need to learn from people with different skills and backgrounds — not from those confined to a department.
When used properly, crowdsourcing can be the most effective, least expensive way of solving problems.
Crowdsourcing can be as simple as asking employees to submit ideas via email or via online discussion boards, or it can assemble cross-disciplinary groups to exchange ideas and brainstorm. Internet platforms such as Zoho Connect, IdeaScale and GroupTie can facilitate group ideation by providing the ability to pose questions to a large number of people and having them discuss responses with each other.
Many of the ideas proposed by the crowd as well as the discussions will seem outlandish — especially if anonymity is allowed on discussion forums. And companies will surely hear things they won’t like. But this is exactly the input and out-of-the-box thinking that they need in order to survive and thrive in this era of exponential technologies….
Another way of harnessing the power of the crowd is to hold incentive competitions. These can solve problems, foster innovation and even create industries — just as the first XPRIZE did. Sponsored by the Ansari family, it offered a prize of $10 million to any team that could build a spacecraft capable of carrying three people to 100 kilometers above the earth’s surface, twice within two weeks. It was won in 2004 by Burt Rutan, who launched a spacecraft called SpaceShipOne. Twenty-six teams, from seven countries, spent more than $100 million in competing. Since then, more than $1.5 billion has been invested in private space flight by companies such as Virgin Galactic, Armadillo Aerospace and Blue Origin, according to the XPRIZE Foundation….
Competitions needn’t be so grand. InnoCentive and HeroX, a spinoff from the XPRIZE Foundation, for example, allow prizes as small as a few thousand dollars for solving problems. A company or an individual can specify a problem and offer prizes for whoever comes up with the best idea to solve it. InnoCentive has already run thousands of public and inter-company competitions. The solutions they have crowdsourced have ranged from the development of biomarkers for amyotrophic lateral sclerosis (ALS) to dual-purpose solar lights for African villages….”