Tired of Being Profiled, a Programmer Turns to Crowdsourcing Cop Reviews


Christopher Moraff at Next City: “…despite the fact that policing is arguably one of the most important and powerful service professions a civilized society can produce, it’s far easier to find out if the plumber you just hired broke someone’s pipe while fixing their toilet than it is to find out if the cop patrolling your neighborhood broke someone’s head while arresting them.
A 31-year-old computer programmer has set out to fix that glitch with a new web-based (and soon to be mobile) crowdsourced rating tool called CopScore that is designed to help communities distinguish police officers who are worthy of praise from those who are not fit to wear the uniform….
CopScore is a work in progress, and, for the time being at least, a one-man show. Hardison does all the coding himself, often working through the night to bring new features online.
Currently in the very early beta stage, the platform works by consolidating information on the service records of individual police officers together with details of their interactions with constituents. The searchable platform includes data gleaned from public sources — such as social media and news articles — cross-referenced with Yelp-style ratings from citizens.

For Hardison, CopScore is as much a personal endeavor as it is a professional one. He says his youthful interest in computer programming — which he took up as a misbehaving fifth-grader under the guiding hand of a concerned teacher — made him the butt of the occasional joke in the predominantly African-American community of North Nashville where he grew up….”(More)

UNDP Eyes Ukraine’s Damaged Buildings With Crowdsourcing, Mobile App


Aida Akl at VOA TECHtonics: “The crisis that plunged east Ukraine into war in November 2013 has damaged or destroyed critical infrastructure and limited access to areas caught up in fighting between Ukraine’s government forces and pro-Russian rebels. In order to assess damage, the United Nations Development Program (UNDP) turned to crowdsourcing to help restore social infrastructure as part of a United Nations, European Union and World Bank Recovery and Peacebuilding Assessment for Eastern Ukraine….
Using an interactive map, ReDonbass, and a mobile app (Android and iOS), people of Donetsk and Lugansk regions can report damaged homes, hospitals, schools, kindergartens or libraries.
[Image caption: A screenshot of UNDP's app and crowdsourcing map for east Ukraine damage assessment. (UNDP)]
The easy-to-use interactive tool allows anyone with a mobile phone and Internet access to upload accurate data about a building at its location, photographs of the damage, and the status of the recovery phase. The Ukrainian government and international donors will then use the data to better plan reconstruction.
Information from the map will contribute to an ongoing Recovery and Peacebuilding Assessment for Eastern Ukraine. UNDP is part of the assessment that brings the United Nations, the European Union and the World Bank Group together to analyze the impact of the conflict and offer recommendations for short-term recovery and peacebuilding over the next two years….The map has also proven to be very useful for the experts from the Ukrainian Government and a recently launched UNDP-Government of Japan project aiming to restore critical infrastructure for social care and services. They [are] using it to identify schools, orphanages, elderly homes, and social services centers that need to be restored and rebuilt first….(More)”.

Crowdsourcing Dilemma


New paper by Victor Naroditskiy, Nicholas R. Jennings, Pascal Van Hentenryck, Manuel Cebrian: “Crowdsourcing offers unprecedented potential for solving tasks efficiently by tapping into the skills of large groups of people. A salient feature of crowdsourcing—its openness of entry—makes it vulnerable to malicious behavior. Such behavior took place in a number of recent popular crowdsourcing competitions. We provide game-theoretic analysis of a fundamental tradeoff between the potential for increased productivity and the possibility of being set back by malicious behavior. Our results show that in crowdsourcing competitions malicious behavior is the norm, not the anomaly—a result contrary to the conventional wisdom in the area. Counterintuitively, making the attacks more costly does not deter them but leads to a less desirable outcome. These findings have cautionary implications for the design of crowdsourcing competitions…(More)”

Schemes used by South Australia to include citizens in policy making


Joshua Chambers at Future Gov Asia: “…South Australia has pioneered a number of innovative methods to try to include its residents in policymaking. …The highest-profile participatory programme run by the state government is the Citizens’ Jury initiative. …The Citizens’ Jury takes a randomly selected, representative group of citizens through a process to hear arguments and evidence, much like a jury in a trial, before writing an independent report that makes recommendations to government.
There were 37 members of the jury, hearing evidence on Thursday evenings and Saturdays over a five-week period. They heard from motorists’ associations, cycling associations, and all sorts of other interested groups.
They used Basecamp software to ensure that jurors stayed connected when not at meetings, hosting discussions in a private space to consider the evidence they heard. …The jurors prepared 21 recommendations, ranging from decreasing speeds in the city to a schools programme…. The Government supports the majority of the recommendations and will investigate the remaining three.
The government has also committed to providing jurors with an update every six months on the progress being made in this area.
Lessons and challenges
As would be expected with an innovative new scheme, it hasn’t always been smooth. One lesson learned from the first initiative was that affected agencies need to be engaged in advance, and briefed throughout the process, so that they can prepare their responses and resources. ….
Aside from the Citizens’ Jury, the Government of South Australia is also pioneering other approaches to include citizens in policy making. Fund My Idea is a crowdsourcing site that allows citizens to propose new projects. …(More)”

Social Sensing and Crowdsourcing: the future of connected sensors


Conference Paper by C. Geijer, M. Larsson, M. Stigelid: “Social sensing is becoming an alternative to static sensors. It is a way to crowdsource data collection where sensors can be placed on frequently used objects, such as mobile phones or cars, to gather important information. Increasing availability of technology, such as cheap sensors being added to cell phones, creates an opportunity to build bigger sensor networks that are capable of collecting larger quantities of more complex data. The purpose of this paper is to highlight problems in the field, as well as their solutions. The focus lies on the use of physical sensors and not on the use of social media to collect data. Research papers were reviewed based on implemented or suggested implementations of social sensing. The discovered problems are contrasted with possible solutions and used to reflect upon the future of the field. We found issues such as privacy, noise and trustworthiness to be problems when using a distributed network of sensors. Furthermore, we discovered models for determining the accuracy as well as the truthfulness of gathered data that can effectively combat these problems. The topic of privacy remains an open-ended problem, since it is based upon ethical considerations that may differ from person to person, but there exist methods for addressing this as well. The reviewed research suggests that social sensing will become more and more useful in the future….(More).”

VoXup


Nesta: “Does your street feel safe? Would you like to change something in your neighbourhood? Is there enough for young people to do?
All basic questions, but how many local councillors have the time to put these issues to their constituents? A new web app aims to make it easier for councillors and council officers to talk to residents – and it’s all based around a series of simple questions.
Now, just a year after VoXup was created in a north London pub, Camden Council is using it to consult residents on its budget proposals.
One of VoXup’s creators, Peter Lewis, hit upon the idea after meeting an MP and being reminded of how hard it can be to get involved in decision-making….

Now VoXup is being used by Camden Council to engage with residents about its spending plans.
“They’ve got to cut a lot of money and they want to know which services people would prioritise,” Lewis explains.
“So we’ve created a custom community, and most popular topics have got about 200 votes. About 650 people have taken part at some level, and it’s only just begun. We’ve seen a lot of activity – of the people who look at the web page, almost half give an opinion on something.”

‘No need for smartphone app’
What does the future hold for VoXup? Lewis, who is working on the project full-time, says one thing the team won’t be doing is building a smartphone app.
“One of the things we thought about doing was creating a mobile app, but that’s been really unnecessary – we built VoXup as a responsive web app,” he says…. (More)”.

Coop’s Citizen Sci Scoop: Try it, you might like it


Response by Caren Cooper at PLOS: “Margaret Mead, the world-famous anthropologist, said, “never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.”
The sentiment rings true for citizen science.
Yet, recent news in the citizen science world has been headlined “Most participants in citizen science projects give up almost immediately.” This was based on a study of participation in seven different projects within the crowdsourcing hub called Zooniverse. Most participants tried a project once, very briefly, and never returned.
What’s unusual about Zooniverse projects is not the high turnover of quitters. Rather, it’s unusual that even early quitters do some important work. That’s a cleverly designed project. An ethical principle of Zooniverse is to not waste people’s time. The crowdsourcing tasks are pivotal to advancing research. They cannot be accomplished by computer algorithms or machines. They require crowds of people, each chipping in a tiny bit. What is remarkable is that the quitters matter at all….
An Internet rule of thumb is that only 1% (or less) of users add new content to sites like Wikipedia. Citizen science appears to operate on this dynamic, except instead of a core group adding existing knowledge for the crowd to use, a core group is involved in making new knowledge for the crowd to use….
In citizen science, a crowd can be four or a crowd can be hundreds of thousands. Citizen scientists are not people who will participate in just any project. They are individuals – gamers, birders, stargazers, gardeners, weather bugs, hikers, naturalists, and more – with particular interests and motivations.
As my grandfather said, “Try it, you might like it.” It’s fabulous that millions are trying it. Sooner or later, when participants and projects find one another, a good match translates into a job well done….(More)”.

Motivations for sustained participation in crowdsourcing: The role of talk in a citizen science case study


Paper by C.B. Jackson, C. Østerlund, G. Mugar, K.D.V. Hassman for the Proceedings of the Forty-eighth Hawai’i International Conference on System Sciences (HICSS-48): “The paper explores the motivations of volunteers in a large crowdsourcing project and contributes to our understanding of the motivational factors that lead to deeper engagement beyond initial participation. Drawing on the theory of legitimate peripheral participation (LPP) and the literature on motivation in crowdsourcing, we analyze interview and trace data from a large citizen science project. The analyses identify ways in which the technical features of the projects may serve as motivational factors leading participants towards sustained participation. The results suggest volunteers first engage in activities to support knowledge acquisition, later share knowledge with other volunteers, and finally increase participation in Talk through a punctuated process of role discovery…(More)”

Turns Out the Internet Is Bad at Guessing How Many Coins Are in a Jar


Eric B. Steiner at Wired: “A few weeks ago, I asked the internet to guess how many coins were in a huge jar… The mathematical theory behind this kind of estimation game is apparently sound. That is, the mean of all the estimates will be uncannily close to the actual value, every time. James Surowiecki’s best-selling book, The Wisdom of Crowds, banks on this principle and details several striking anecdotes of crowd accuracy. The most famous is a 1906 competition in Plymouth, England, to guess the weight of an ox. As reported by Sir Francis Galton in a letter to Nature, no one guessed the actual weight of the ox, but the average of all 787 submitted guesses was exactly the beast’s actual weight….
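The statistics behind that claim are easy to sketch: if every guess is an independent draw scattered around the true value, the individual errors cancel and the crowd mean homes in on the truth. Below is a minimal simulation sketch of that idealized case; the jar count, error spread, and crowd size are illustrative assumptions, not figures from the Wired experiment.

```python
import random

TRUE_COUNT = 1000      # hypothetical number of coins in the jar (assumption)
CROWD_SIZE = 787       # same number of guesses as Galton's ox contest
ERROR_SPREAD = 300     # assumed spread of individual guessing error

random.seed(42)

# Each guesser is noisy but independent and unbiased on average.
guesses = [random.gauss(TRUE_COUNT, ERROR_SPREAD) for _ in range(CROWD_SIZE)]

crowd_mean = sum(guesses) / len(guesses)
print(f"True count: {TRUE_COUNT}")
print(f"Crowd mean: {crowd_mean:.0f}")  # typically within a few percent of the truth
```

The assumptions doing the work here are independence and unbiasedness, which is exactly where real crowds tend to fall short.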
So what happened to the collective intelligence supposedly buried in our disparate ignorance?
Most successful crowdsourcing projects are essentially the sum of many small parts: efficiently harvested resources (information, effort, money) courtesy of a large group of contributors. Think Wikipedia, Google search results, Amazon’s Mechanical Turk, and Kickstarter.
But a sum of parts does not wisdom make. When we try to produce collective intelligence, things get messy. Whether we are predicting the outcome of an election, betting on sporting contests, or estimating the value of coins in a jar, the crowd’s take is vulnerable to at least three major factors: skill, diversity, and independence.
A certain amount of skill or knowledge in the crowd is obviously required, while crowd diversity expands the number of possible solutions or strategies. Participant independence is important because it preserves the value of individual contributors, which is another way of saying that if everyone copies their neighbor’s guess, the data are doomed.
Failure to meet any one of these conditions can lead to wildly inaccurate answers, information echo, or herd-like behavior. (There is more than a little irony with the herding hazard: The internet makes it possible to measure crowd wisdom and maybe put it to use. Yet because people tend to base their opinions on the opinions of others, the internet ends up amplifying the social conformity effect, thereby preventing an accurate picture of what the crowd actually thinks.)
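To make the herding point concrete, here is a hedged variant of the sketch above in which each guesser blends a private estimate with the running average of the guesses already posted; the conformity weight and the early outlier are illustrative assumptions, not data from the experiment. Even with hundreds of participants, the crowd mean keeps echoing the early guesses instead of converging on the truth.

```python
import random

TRUE_COUNT = 1000      # hypothetical true number of coins (assumption)
ERROR_SPREAD = 300     # assumed spread of private estimates
CONFORMITY = 0.7       # assumed weight placed on the visible crowd average

random.seed(7)

public_guesses = [2500.0]  # one loud early outlier that everyone can see

for _ in range(786):
    private = random.gauss(TRUE_COUNT, ERROR_SPREAD)
    social = sum(public_guesses) / len(public_guesses)
    # Each newcomer copies the crowd as much as they trust their own eyes.
    public_guesses.append(CONFORMITY * social + (1 - CONFORMITY) * private)

herded_mean = sum(public_guesses) / len(public_guesses)
print(f"True count:        {TRUE_COUNT}")
print(f"Herded crowd mean: {herded_mean:.0f}")  # stays well above the true count
```

The only change from the first sketch is that guesses are no longer independent, yet that alone is enough to keep the average anchored to the early outlier.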
What’s more, even when these conditions—skill, diversity, independence—are reasonably satisfied, as they were in the coin jar experiment, humans exhibit a whole host of other cognitive biases and irrational thinking that can impede crowd wisdom. True, some bias can be positive; all that Gladwellian snap-judgment stuff. But most biases aren’t so helpful, and can too easily lead us to ignore evidence, overestimate probabilities, and see patterns where there are none. These biases are not vanquished simply by expanding sample size. On the contrary, they get magnified.
Given the last 60 years of research in cognitive psychology, I submit that Galton’s results with the ox weight data were outrageously lucky, and that the same is true of other instances of seemingly perfect “bean jar”-styled experiments….”

Democratizing Inequalities: Dilemmas of the New Public Participation


New book edited by Caroline W. Lee, Michael McQuarrie and Edward T. Walker: “Opportunities to “have your say,” “get involved,” and “join the conversation” are everywhere in public life. From crowdsourcing and town hall meetings to government experiments with social media, participatory politics increasingly seem like a revolutionary antidote to the decline of civic engagement and the thinning of the contemporary public sphere. Many argue that, with new technologies, flexible organizational cultures, and a supportive policymaking context, we now hold the keys to large-scale democratic revitalization.
Democratizing Inequalities shows that the equation may not be so simple. Modern societies face a variety of structural problems that limit potentials for true democratization, as well as vast inequalities in political action and voice that are not easily resolved by participatory solutions. Popular participation may even reinforce elite power in unexpected ways. Resisting an oversimplified account of participation as empowerment, this collection of essays brings together a diverse range of leading scholars to reveal surprising insights into how dilemmas of the new public participation play out in politics and organizations. Through investigations including fights over the authenticity of business-sponsored public participation, the surge of the Tea Party, the role of corporations in electoral campaigns, and participatory budgeting practices in Brazil, Democratizing Inequalities seeks to refresh our understanding of public participation and trace the reshaping of authority in today’s political environment.”