Paper by Joshua C. Gellers in International Environmental Agreements: Politics, Law and Economics: “To what extent can crowdsourcing help members of civil society overcome the democratic deficit in global environmental governance? In this paper, I evaluate the utility of crowdsourcing as a tool for participatory agenda-setting in the realm of post-2015 sustainable development policy. In particular, I analyze the descriptive representativeness (i.e., the degree to which participation mirrors the demographic attributes of non-state actors comprising global civil society) of participants in two United Nations-orchestrated crowdsourcing processes—the MY World survey and e-discussions regarding environmental sustainability. I find that there exists a perceptible demographic imbalance among contributors to the MY World survey and considerable dissonance between the characteristics of participants in the e-discussions and those whose voices were included in the resulting summary report. The results suggest that although crowdsourcing may present an attractive technological approach to expand participation in global governance, ultimately the representativeness of that participation and the legitimacy of policy outputs depend on the manner in which contributions are solicited and filtered by international institutions….(More)”
NEW Platform for Sharing Research on Opening Governance: The Open Governance Research Exchange (OGRX)
Andrew Young: “Today, The GovLab, in collaboration with founding partners mySociety and the World Bank’s Digital Engagement Evaluation Team, is launching the Open Governance Research Exchange (OGRX), a new platform for sharing research and findings on innovations in governance.
From crowdsourcing to nudges to open data to participatory budgeting, more open and innovative ways to tackle society’s problems and make public institutions more effective are emerging. Yet little is known about what innovations actually work, when, why, for whom and under what conditions.
And anyone seeking existing research is confronted with sources that are widely dispersed across disciplines, often locked behind paywalls, and hard to search because of the absence of established taxonomies. As the demand to confront problems in new ways grows, so too does the urgency of making learning about governance innovations more accessible.
As part of GovLab’s broader effort to move from “faith-based interventions” toward more “evidence-based interventions,” OGRX curates and makes accessible the most diverse and up-to-date collection of findings on innovating governance. At launch, the site features over 350 publications spanning a diversity of governance innovation areas, including but not limited to:
- Behavioral Science and Nudges
- Citizen Engagement and Crowdsourcing
- Civic Technology
- Data Analysis
- Expert Networking
- Labs and Experimentation
- Open Data…
Visit ogrx.org to explore the latest research findings, submit your own work for inclusion on the platform, and share knowledge with others interested in using science and technology to improve the way we govern. (More)”
Crowdsourcing Solutions and Crisis Information during the Renaissance
Patrick Meier: “Clearly, crowdsourcing is not new, only the word is. After all, crowdsourcing is a methodology, not a technology or an industry. Perhaps one of my favorite examples of crowdsourcing during the Renaissance surrounds the invention of the marine chronometer, which completely revolutionized long-distance sea travel. Thousands of lives were being lost in shipwrecks because longitude coordinates were virtually impossible to determine in the open seas. Finding a solution to this problem became critical as the Age of Sail dawned on many European empires.
So the Spanish King, Dutch Merchants and others turned to crowdsourcing by offering major prize money for a solution. The British government even launched the “Longitude Prize” which was established through an Act of Parliament in 1714 and administered by the “Board of Longitude.” This board brought together the greatest scientific minds of the time to work on the problem, including Sir Isaac Newton. Galileo was also said to have taken up the challenge.
The main prizes included: “£10,000 for a method that could determine longitude within 60 nautical miles (111 km); £15,000 for a method that could determine longitude within 40 nautical miles (74 km); and £20,000 for a method that could determine longitude within 30 nautical miles (56 km).” Note that £20,000 in 1714 is around $4.7 million today; the $1 million Netflix Prize, launched some 300 years later, pales in comparison. In addition, the Board had the discretion to make awards to persons who were making significant contributions to the effort or to provide financial support to those who were working towards a solution. The Board could also make advances of up to £2,000 for experimental work deemed promising.
Interestingly, the person who provided the most breakthroughs—and thus received the most prize money—was the son of a carpenter, the self-educated British clockmaker John Harrison. And so, as noted by Peter LaMotte, “by allowing anyone to participate in solving the problem, a solution was found for a puzzle that had baffled some of the brightest minds in history (even Galileo!). In the end, it was found by someone who would never have been tapped to solve it to begin with.”…(More)”
The Evolution of Wikipedia’s Norm Network
Bradi Heaberlin and Simon DeDeo at Future Internet: “Social norms have traditionally been difficult to quantify. In any particular society, their sheer number and complex interdependencies often limit a system-level analysis. One exception is that of the network of norms that sustain the online Wikipedia community. We study the fifteen-year evolution of this network using the interconnected set of pages that establish, describe, and interpret the community’s norms. Despite Wikipedia’s reputation for ad hoc governance, we find that its normative evolution is highly conservative. The earliest users create norms that both dominate the network and persist over time. These core norms govern both content and interpersonal interactions using abstract principles such as neutrality, verifiability, and assume good faith. As the network grows, norm neighborhoods decouple topologically from each other, while increasing in semantic coherence. Taken together, these results suggest that the evolution of Wikipedia’s norm network is akin to bureaucratic systems that predate the information age….(More)”
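For readers curious about the method, here is a minimal, hedged sketch of how such a norm network might be assembled and probed with standard network tools. It uses toy data and off-the-shelf centrality and community measures; it is an illustration of the general approach, not the authors’ actual dataset or pipeline.

```python
# A hedged sketch: build a directed network of Wikipedia policy/guideline pages
# from their mutual links, find the most central ("core") norms, and look for
# norm neighborhoods. Toy data for illustration only.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy edge list: page A links to page B
links = [
    ("Neutral point of view", "Verifiability"),
    ("Verifiability", "Neutral point of view"),
    ("Assume good faith", "Civility"),
    ("Civility", "Assume good faith"),
    ("No original research", "Verifiability"),
]

G = nx.DiGraph(links)

# Core norms: pages that the rest of the network points to most
centrality = nx.pagerank(G)
core = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("Core norms:", core)

# Norm neighborhoods: densely connected clusters of related pages
communities = greedy_modularity_communities(G.to_undirected())
print("Neighborhoods:", [sorted(c) for c in communities])
```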
Mexico City is crowdsourcing its new constitution using Change.org in a democracy experiment
Ana Campoy at Quartz: “Mexico City just launched a massive experiment in digital democracy. It is asking its nearly 9 million residents to help draft a new constitution through social media. The crowdsourcing exercise is unprecedented in Mexico—and pretty much everywhere else.
Chilangos, as locals are known, can petition for issues to be included in the constitution through Change.org (link in Spanish), and make their case in person if they gather more than 10,000 signatures. They can also annotate proposals by the constitution drafters via PubPub, an editing platform (Spanish) similar to Google Docs.
The idea, in the words of the mayor, Miguel Angel Mancera, is to “bestow the constitution project (Spanish) with a democratic, progressive, inclusive, civic and plural character.”
There’s a big catch, however. The constitutional assembly—the body that has the final word on the city’s new basic law—is under no obligation to consider any of the citizen input. And then there are the practical difficulties of collecting and summarizing the myriad views dispersed throughout one of the world’s largest cities.
That makes Mexico City’s public-consultation experiment a big test for the people’s digital power, one being watched around the world.
Fittingly, the idea of crowdsourcing a constitution came about in response to an attempt to limit people power.
For decades, city officials had fought to get out from under the thumb of the federal government, which had the final word on decisions such as who should be the city’s chief of police. This year, finally, they won a legal change that turns the Distrito Federal (federal district), similar to the US’s District of Columbia, into Ciudad de México (Mexico City), a more autonomous entity, more akin to a state. (Confusingly, it’s just part of the larger urban area also colloquially known as Mexico City, which spills into neighboring states.)
However, trying to retain some control, the Mexican congress decided that only 60% of the delegates to the city’s constitutional assembly would be elected by popular vote. The rest will be assigned by the president, congress, and Mancera, the mayor. Mancera is also the only one who can submit a draft constitution to the assembly.
Mancera’s response was to create a committee of some 30 citizens (Spanish), including politicians, human-rights advocates, journalists, and even a Paralympic gold medalist, to write his draft. He also called for the development of mechanisms to gather citizens’ “aspirations, values, and longing for freedom and justice” so they can be incorporated into the final document.
Mexico City didn’t have a lot of examples to draw on, since not a lot of places have experience with crowdsourcing laws. In the US, a few local lawmakers have used wiki pages and GitHub to draft bills, says Marilyn Bautista, a lecturer at Stanford Law School who has researched the practice. Iceland—with a population some 27 times smaller than Mexico City’s—famously had its citizens contribute to its constitution with input from social media. The effort failed after the new constitution got stuck in parliament.
In Mexico City, where many citizens already feel left out, the first big hurdle is to convince them it’s worth participating….
Then comes the task of making sense of the cacophony that will likely emerge. Some of the input can be very easily organized—the results of the survey, for example, are being graphed in real time. But there could be thousands of documents and comments on the Change.org petitions and the editing platform.
The most elaborate part of the system is PubPub, an open publishing platform similar to Google Docs, which is based on a project originally developed by MIT’s Media Lab. The drafters are supposed to post essays on how to address constitutional issues, and potentially, the constitution draft itself, once there is one. Only they—or whoever they authorize—will be able to reword the original document.
Tag monitors air pollution and never loses charge
Springwise: “The battle to clean up the air of major cities is well underway, with businesses and politicians pledging to help with the pollution issue. We have seen projects using mobile air sensors mounted on pigeons to bring the problem to public attention, and now a new crowdsourcing campaign is attempting to map the UK’s air pollution.
CleanSpace uses a portable, air pollution-sensing tag to track exposure to harmful pollutants in real time. The tag is connected to an app, which analyzes the data and combines it with that of other users in the UK to create an air pollution map.
An interesting part of the CleanSpace Tag’s technology is that it never needs to be charged. The startup says the tag is powered by harvesting 2G, 3G, 4G and wifi signals, which meet its small power requirements. The app also rewards users for traveling on foot or by bike, offering them “CleanMiles” that can be exchanged for discounts with CleanSpace’s partners.
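As a rough illustration of how crowdsourced readings like these could become a shared map, here is a minimal sketch that bins geotagged measurements into grid cells and averages them. The field names, grid size and sample data are assumptions made for illustration, not CleanSpace’s actual pipeline.

```python
# Hypothetical sketch: pooling geotagged readings from many users into a
# simple crowdsourced pollution map. Field names, cell size and data are
# illustrative assumptions only.
from collections import defaultdict
from statistics import mean

def pollution_grid(readings, cell_size=0.01):
    """Average pollutant readings into lat/lon grid cells (~1 km at UK latitudes)."""
    cells = defaultdict(list)
    for r in readings:  # each reading: {"lat": ..., "lon": ..., "pm25": ...}
        key = (round(r["lat"] / cell_size), round(r["lon"] / cell_size))
        cells[key].append(r["pm25"])
    return {key: mean(values) for key, values in cells.items()}

# Example: readings contributed by different users around London and Bristol
readings = [
    {"lat": 51.5074, "lon": -0.1278, "pm25": 18.0},
    {"lat": 51.5080, "lon": -0.1265, "pm25": 22.0},
    {"lat": 51.4545, "lon": -2.5879, "pm25": 9.0},
]
print(pollution_grid(readings))
```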
The startup successfully raised more than GBP 100,000 in a crowdfunding campaign last year, and the team has given back GBP 10,000 to their charitable partners this year. …(More)”
Crowdsourcing healthcare costs: Opportunities and challenges for patient centered price transparency
Paper by Zachary F. Meisel, Lauren A. Houdek VonHoltz, and Raina M. Merchant in Healthcare: “Efforts to improve health care price transparency have garnered significant attention from patients, policy makers, and health insurers. In response to increasing consumer demand, state governments, insurance plans, and health care providers are reporting health care prices. However, such data often do not provide consumers with the most salient information: their own actual out-of-pocket cost for medical care. Although untested, crowdsourcing, a mechanism for the public to help answer complex questions, represents a potential solution to the problem of opaque hospital costs. This article explores the challenges and potential opportunities for crowdsourcing out-of-pocket costs for healthcare consumers….(More)”.
Crowdcrafting
“Crowdcrafting is a web-based service that invites volunteers to contribute to scientific projects developed by citizens, professionals or institutions that need help to solve problems, analyze data or complete challenging tasks that can’t be done by machines alone, but require human intelligence. The platform is 100% open source – that is, its software is developed and distributed freely – and 100% open science, making scientific research accessible to everyone.
Crowdcrafting uses PyBossa software, our open source framework for crowdsourcing projects. Institutions such as the British Museum, CERN and the United Nations (UNITAR) are also PyBossa users.
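For the curious, here is a minimal sketch of what contributing to a PyBossa project programmatically might look like. The endpoint paths and payload fields follow PyBossa’s documented REST API but should be checked against the current documentation; the project id and answer shown are hypothetical.

```python
# A minimal sketch of contributing to a PyBossa project via its REST API.
# Endpoints and payload fields are based on PyBossa's documentation and may
# need adjusting; the project id and answer below are hypothetical.
import requests

SERVER = "https://crowdcrafting.org"
API_KEY = "your-api-key"        # found in your Crowdcrafting account profile
PROJECT_ID = 1234               # hypothetical project

# Ask the server for the next task assigned to this user in the project
task = requests.get(f"{SERVER}/api/project/{PROJECT_ID}/newtask",
                    params={"api_key": API_KEY}).json()

# Submit an answer (a "task run") for that task
answer = {"label": "galaxy"}    # whatever the project asks volunteers to provide
run = {"project_id": PROJECT_ID, "task_id": task["id"], "info": answer}
resp = requests.post(f"{SERVER}/api/taskrun",
                     params={"api_key": API_KEY}, json=run)
print(resp.status_code, resp.json())
```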
What is citizen science?
Citizen science is the active contribution to science of people who are not professional scientists. It provides volunteers with the opportunity to contribute intellectually to the research of others, to share resources or tools at their disposal, or even to start their own research projects. Volunteers provide real value to ongoing research while they themselves acquire a better understanding of the scientific method.
Citizen science opens the doors of laboratories and makes science accessible to all. It facilitates a direct conversation between scientists and enthusiasts who wish to contribute to scientific endeavour.
Who can collaborate, and how?
Anyone can create a new project or contribute to an existing project in Crowdcrafting.
All projects start with a simple tutorial explaining how they work and providing all the information required to participate. There is thus no specific knowledge or experience required to complete proposed tasks. All volunteers need is a keen attitude to learn and share science with everyone….(More)”
citizenscience.gov
Wiki-fishing
The Economist: “….Mr Rhoads is a member of a network started by the Alaska Longline Fishermen’s Association (ALFA), which aims to do something about this and to reduce by-catch of sensitive species such as rockfish at the same time. Network fishermen, who numbered only 20 at the project’s start, agreed to share data on where and what they were catching in order to create maps that highlighted areas of high by-catch. Within two years they had reduced accidental rockfish harvest by as much as 20%.
The rockfish mapping project expanded to create detailed maps of the sea floor, pooling data gathered by transducers fixed to the bottoms of boats. By combining thousands of data points as vessels traverse the fishing grounds, these “wikimaps”—created and updated through crowdsourcing—show gravel beds where bottom-dwelling halibut are likely to linger, craggy terrain where rockfish tend to lurk, and outcrops that could snag gear.
Public charts are imprecise, and equipment with the capability to sense this level of detail could cost a fisherman more than $70,000. Skippers join ALFA for as little as $250, invest a couple of thousand dollars in computers and software and enter into an agreement to turn over fishing data and not to share the information outside the network, which now includes 85 fishermen.
Skippers say the project makes them more efficient, better able to find the sort of fish they want and avoid squandering time on lost or tangled gear. It also means fewer hooks in the water and fewer hours at sea to catch the same amount of fish….(More)”
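To make the idea concrete, here is a rough sketch of how depth soundings pooled from many vessels could be gridded into a shared seafloor map, with a simple roughness score to flag craggy cells that might snag gear. This is an illustration only, not ALFA’s actual system; all names, values and thresholds are assumptions.

```python
# Hypothetical sketch: gridding crowdsourced depth soundings into a seafloor
# "wikimap". Median depth per cell plus a roughness score; all values and
# thresholds are illustrative assumptions.
from collections import defaultdict
from statistics import median, pstdev

def seafloor_grid(soundings, cell_size=0.005):
    cells = defaultdict(list)
    for lat, lon, depth_m in soundings:  # one sounding per transducer reading
        key = (round(lat / cell_size), round(lon / cell_size))
        cells[key].append(depth_m)
    grid = {}
    for key, depths in cells.items():
        roughness = pstdev(depths) if len(depths) > 1 else 0.0
        grid[key] = {
            "median_depth_m": median(depths),
            "roughness_m": roughness,
            "snag_risk": roughness > 5.0,   # arbitrary illustrative threshold
        }
    return grid

# Example: soundings contributed by different vessels near Sitka, Alaska
soundings = [(57.05, -135.33, 120.0), (57.05, -135.33, 131.0), (57.10, -135.40, 88.0)]
for cell, info in seafloor_grid(soundings).items():
    print(cell, info)
```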