What is Citizensourcing?


Citizensourcing is the crowdsourcing practice applied by governments with the goal of tapping into the collective intelligence of the citizens. Through citizensourcing, governments can collect ideas, suggestions and opinions from their citizens — thereby creating a permanent feedback loop of communication.

Cities are powerhouses of collective intelligence. Thanks to modern technologies, the time has come to unlock the wisdom of the crowd.

Yesterday

The current means of engaging citizens in public policy have been in place since the 18th century: town hall meetings, in-person visits, phone calls and bureaucratic forms for submitting an idea. All of these forms of engagement are time-consuming, ineffective and expensive.

Great ideas and valuable feedback get lost because these forms of engagement take too much effort for both citizens and cities. Moreover, communication happens in private between the city government and individual citizens; citizens cannot discuss with each other how they want to improve their city.

Today

Advances in technology have restructured the way societies are organised; we’re living in a digital age in which citizens are connected over networks. This creates unprecedented opportunities for cities to get closer to their citizens and serve them better. In recent years, several cities have tried to build a strong online presence on social media channels.

Yet, they have discovered that communicating with their citizens over Twitter and Facebook is far from optimal. Messages get lost in the information overload that characterises those platforms, resulting in a lack of structured communication.

Tomorrow

Imagine your town hall meetings held online: open 24/7 and accessible from every possible device. Citizensourcing on a dedicated platform is an inexpensive way for cities to get valuable input in the form of ideas, feedback and opinions from their citizens.

Whereas only a very small proportion of citizens engage in time-consuming offline participation, an online platform allows you to multiply your reach tenfold. You reach an audience of citizens that you couldn’t reach before, which makes an online platform a much-needed complement to the existing offline channels in every city.

When citizens can share their ideas in an easy and fun way and get rewarded for their valuable input, that’s when the wisdom of the crowd gets truly unlocked.

The most direct benefit for cities is clear: crowdsourcing new urban ideas drives superior innovations. At least as important as the new channel for proposals is that engagement leads to a better understanding of the different needs citizens have…

Several early success stories already show the enormous potential:

  • The Colombian city of Medellín has its own crowdsourcing platform, MiMedellín, on which citizens share solutions for urban problems the city faces. It turned out to be a big success: having collected more than 2,300 posted ideas, the government is already developing policies drawing on the creativity of citizens.
  • In the Icelandic capital, Reykjavik, the city council succeeded in getting its citizensourcing website, Better Reykjavik, used by over 60% of citizens. Since implementing the platform, Reykjavik has spent €1.9 million developing more than 200 projects based on ideas from citizens.
  • Paris held a participatory budgeting process, called ‘Madame Mayor, I have an idea’, that brought forward wonderful projects. To name one: after receiving well over 20,000 votes, the city government announced it would invest €2 million in vertical garden projects. Other popular ideas included gardens in schools, neighbourhood recycling centers and co-working spaces for students and entrepreneurs…(More)”

The big cost of using big data in elections


Michael McDonald, Peter Licari and Lia Merivaki in the Washington Post: “In modern campaigns, buzzwords like “microtargeting” and “big data” are often bandied about as essential to victory. These terms refer to the practice of analyzing (or “microtargeting”) millions of voter registration records (“big data”) to predict who will vote and for whom.

If you’ve ever gotten a message from a campaign, there’s a good chance you’ve been microtargeted. Serious campaigns use microtargeting to persuade voters through mailings, phone calls, knocking on doors, and — in our increasingly connected world — social media.

But the big data that fuels such efforts comes at a big price, which can create a serious barrier to entry for candidates and groups seeking to participate in elections — that is, if they are allowed to buy the data at all.

When we asked state election officials about prices and restrictions on who can use their voter registration files, we learned that the rules are unsettlingly arbitrary.

Contrast Arizona and Washington. Arizona sells its statewide voter file for an estimated $32,500, while Washington gives its file away for free. Before jumping to the conclusion that this is a red-state/blue-state thing, consider that Oklahoma gives its file away, too.

A number of states base their prices on a per-record formula, which can massively drive up the price despite the fact that files are often delivered electronically. Alabama sells its records for 1 cent per voter, which yields an approximately $30,000 charge for the lot. Seriously, in this day and age, who prices an electronic database by the record?

Some states will give more data to candidates than to outside groups. Delaware will provide phone numbers to candidates but not to nonprofit organizations doing nonpartisan voter mobilization.

In some states, the voter file is not even available to the general public. States such as South Carolina and Maryland permit access only to residents who are registered voters. States including Kentucky and North Dakota grant access only to campaigns, parties and other political organizations.

We estimate that it would cost roughly $140,000 for an independent presidential campaign or national nonprofit organization to compile a national voter file, and this would not be a one-time cost. Voter lists frequently change as voters are added and deleted.

Guess who most benefits from all the administrative chaos? Political parties and their candidates. Not only are they capable of raising the vast amounts of money needed to purchase the data, but, adding insult to injury, they sometimes don’t even have to. Some states literally bequeath the data to parties at no cost. Alabama goes so far as to give parties a free statewide copy for every election.

Who is hurt by this? Independent candidates and nonprofit organizations that want to run national campaigns but don’t have deep pockets. If someone like Donald Trump launched an independent presidential run, he could buy the necessary data without much difficulty. But a nonprofit focused on mobilizing low-income voters could be stretched thin….(More)”

Syrians discover new use for mobile phones – finding water


Magdalena Mis at Reuters: “Struggling with frequent water cuts, residents of Syria‘s battered city of Aleppo have a new way to find the water needed for their daily lives – an interactive map on mobile phones.

The online map, created by the Red Cross and accessible through mobile phones with 3G technology, helps residents locate the closest of more than 80 water points across the divided city of 2 million and guides them there using the Global Positioning System (GPS).
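The core of such a service is a nearest-point lookup over GPS coordinates. The sketch below is a minimal illustration of that idea, not the ICRC’s actual implementation; the function names and the Aleppo coordinates are assumptions made up for the example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in kilometres between two (lat, lon) points.
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_water_point(user_lat, user_lon, water_points):
    # water_points: list of (name, lat, lon) tuples; return the closest one.
    return min(water_points, key=lambda wp: haversine_km(user_lat, user_lon, wp[1], wp[2]))

# Hypothetical water points (coordinates are illustrative only).
points = [("Mosque well", 36.2021, 37.1343), ("School tank", 36.2154, 37.1500)]
print(nearest_water_point(36.2000, 37.1400, points)[0])  # → Mosque well
```

With only ~80 points, a brute-force scan like this is plenty fast; a real deployment would add map tiles and turn-by-turn guidance on top.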

“The map is very simple and works on every phone, and everybody now has access to a mobile phone with 3G,” International Committee of the Red Cross (ICRC) spokesman Pawel Krzysiek told the Thomson Reuters Foundation in a phone interview from Damascus on Wednesday.

“The important thing is that it’s not just a map – which many people may not know how to read – it’s the GPS that’s making a difference because people can actually be guided to the water point closest to them,” he said.

Aleppo was Syria’s most populated city and commercial hub before the civil war erupted in 2011, but many areas have been reduced to rubble and the city has been carved up between government forces and various insurgent groups.

Water cuts are a regular occurrence, amounting to about two weeks each month, and the infrastructure is on the brink of collapse, Krzysiek said.

The water supply was restored on Wednesday after a four-day cut caused by damage to the main power line providing electricity to some 80 percent of households, Krzysiek said.

More cuts are likely because fighting is preventing engineers from repairing the power line, and diesel, used for standby generators, may run out, he added….

Krzysiek said the ICRC started working on the map after a simple version created for engineers was posted on its Facebook page in the summer, sparking a wave of comments and requests.

“Suddenly people started to share this map and were sending comments on how to improve it and asking for a new, more detailed one.”

Krzysiek said that about 140,000 people were using the old version of the map and 20,000 had already used the new version, launched on Monday…(More)”

Crowdsourced research: Many hands make tight work


 

Raphael Silberzahn & Eric L. Uhlmann in Nature: “…For many research problems, crowdsourcing analyses will not be the optimal solution. It demands a huge amount of resources for just one research question. Some questions will not benefit from a crowd of analysts: researchers’ approaches will be much more similar for simple data sets and research designs than for large and complex ones. Importantly, crowdsourcing does not eliminate all bias. Decisions must still be made about what hypotheses to test, from where to get suitable data, and importantly, which variables can or cannot be collected. (For instance, we did not consider whether a particular player’s skin tone was lighter or darker than that of most of the other players on his team.) Finally, researchers may continue to disagree about findings, which makes it challenging to present a manuscript with a clear conclusion. It can also be puzzling: the investment of more resources can lead to less-clear outcomes.


Still, the effort can be well worth it. Crowdsourcing research can reveal how conclusions are contingent on analytical choices. Furthermore, the crowdsourcing framework also provides researchers with a safe space in which they can vet analytical approaches, explore doubts and get a second, third or fourth opinion. Discussions about analytical approaches happen before committing to a particular strategy. In our project, the teams were essentially peer reviewing each other’s work before even settling on their own analyses. And we found that researchers did change their minds through the course of analysis.

Crowdsourcing also reduces the incentive for flashy results. A single-team project may be published only if it finds significant effects; participants in crowdsourced projects can contribute even with null findings. A range of scientific possibilities are revealed, the results are more credible and analytical choices that seem to sway conclusions can point research in fruitful directions. What is more, analysts learn from each other, and the creativity required to construct analytical methodologies can be better appreciated by the research community and the public.

Of course, researchers who painstakingly collect a data set may not want to share it with others. But greater certainty comes from having an independent check. A coordinated effort boosts incentives for multiple analyses and perspectives in a way that simply making data available post-publication does not.

The transparency resulting from a crowdsourced approach should be particularly beneficial when important policy issues are at stake. The uncertainty of scientific conclusions about, for example, the effects of the minimum wage on unemployment, and the consequences of economic austerity policies should be investigated by crowds of researchers rather than left to single teams of analysts.

Under the current system, strong storylines win out over messy results. Worse, once a finding has been published in a journal, it becomes difficult to challenge. Ideas become entrenched too quickly, and uprooting them is more disruptive than it ought to be. The crowdsourcing approach gives space to dissenting opinions.

Scientists around the world are hungry for more-reliable ways to discover knowledge and eager to forge new kinds of collaborations to do so. Our first project had a budget of zero, and we attracted scores of fellow scientists with two tweets and a Facebook post.

Researchers who are interested in starting or participating in collaborative crowdsourcing projects can access resources available online. We have publicly shared all our materials and survey templates, and the Center for Open Science has just launched ManyLab, a web space where researchers can join crowdsourced projects….(More).

See also Nature special collection:reproducibility

 

Big Data and Mass Shootings


Holman W. Jenkins in the Wall Street Journal: “As always, the dots are connected after the fact, when the connecting is easy. …The day may be coming, sooner than we think, when such incidents can be stopped before they get started. A software program alerts police to a social-media posting by an individual of interest in their jurisdiction. An algorithm reminds them why the individual had become a person of interest—a history of mental illness, an episode involving a neighbor. Months earlier, discreet inquires by police had revealed an unhealthy obsession with weapons—key word, unhealthy. There’s no reason why gun owners, range operators and firearms dealers shouldn’t be a source of information for local police seeking information about who might merit special attention.

Sound scary? Big data exists to find the signal among the noise. Your data is the noise. It’s what computerized systems seek to disregard in their quest for information that actually would be useful to act on. Big data is interested in needles, not hay.

Still don’t trust the government? You’re barking up an outdated tree. Consider the absurdly ancillary debate last year on whether the government should be allowed to hold telephone “metadata” when the government already holds vastly more sensitive data on all of us in the form of tax, medical, legal and census records.

All this seems doubly silly given the spacious information about each of us contained in private databases, freely bought and sold by marketers. Bizarre is the idea that Facebook should be able to use our voluntary Facebook postings to decide what we might like to buy, but police shouldn’t use the same information to prevent crime.

Hitachi, the big Japanese company, began testing its crime-prediction software in several unnamed American cities this month. The project, called Hitachi Visualization Predictive Crime Analytics, culls crime records, map and transit data, weather reports, social media and other sources for patterns that might otherwise go unnoticed by police.

Colorado-based Intrado, working with LexisNexis and Motorola Solutions, already sells police a service that instantly scans legal, business and social-media records for information about persons and circumstances that officers may encounter when responding to a 911 call at a specific address. Hundreds of public safety agencies find the system invaluable, though that didn’t stop the city of Bellingham, Wash., from rejecting it last year on the odd grounds that such software must be guilty of racial profiling.

Big data is changing how police allocate resources and go about fighting crime. …It once was freely asserted that police weren’t supposed to prevent crime, only solve it. But recent research shows investment in policing actually does reduce crime rates—and produces a large positive return measured in dollars and cents. A day will come when failing to connect the dots in advance of a mass shooting won’t be a matter for upturned hands. It will be a matter for serious recrimination…(More)

We Need Both Networks and Communities


Henry Mintzberg at HBR: “If you want to understand the difference between a network and a community, ask your Facebook friends to help paint your house.

Social media certainly connects us to whoever is on the other end of the line, and so extends our social networks in amazing ways. But this can come at the expense of deeper personal relationships. When it feels like we’re up-to-date on our friends’ lives through Facebook or Instagram, we may become less likely to call them, much less meet up. Networks connect; communities care.

….A century or two ago, the word community “seemed to connote a specific group of people, from a particular patch of earth, who knew and judged and kept an eye on one another, who shared habits and history and memories, and could at times be persuaded to act as a whole on behalf of a part.” In contrast, the word has now become fashionable to describe what are really networks, as in the “business community”: “people with common interests [but] not common values, history, or memory.”

Does this matter for managing in the digital age, even for dealing with our global problems? It sure does. In a 2012 New York Times column, Thomas Friedman reported asking an Egyptian friend about the protest movements in that country: “Facebook really helped people to communicate, but not to collaborate,” he replied. Friedman added that “at their worst, [social media sites] can become addictive substitutes for real action.” That is why, while the larger social movements, as in Cairo’s Tahrir Square or on Wall Street, may raise consciousness about the need for renewal in society, it is the smaller social initiatives, usually developed by small groups in communities, that do much of the renewing….

We tend to make a great fuss about leadership these days, but communityship is more important. The great leaders create, enhance, and support a sense of community in their organizations, and that requires hands-on management. Hence managers have to get beyond their individual leadership to recognize the collective nature of effective enterprise.

Especially for operating around the globe, electronic communication has become essential. But the heart of enterprise remains rooted in personal collaborative relationships, albeit networked by the new information technologies. Thus, in localities and organizations, across societies and around the globe, beware of “networked individualism”, where people communicate readily while struggling to collaborate.

The new digital technologies, wonderful as they are in enhancing communication, can have a negative effect on collaboration unless they are carefully managed. An electronic device puts us in touch with a keyboard, that’s all….(More)”

A new model to explore non-profit social media use for advocacy and civic engagement


David Chapman, Katrina Miller-Stevens, John C Morris, and Brendan O’Hallarn in First Monday: “In an age when electronic communication is ubiquitous, non-profit organizations are actively using social media platforms as a way to deliver information to end users. In spite of the broad use of these platforms, little scholarship has focused on the internal processes these organizations employ to implement these tools. A limited number of studies offer models to help explain an organization’s use of social media from initiation to outcomes, yet few studies address a non-profit organization’s mission as the driver to employ social media strategies and tactics. Furthermore, the effectiveness of social media use is difficult for non-profit organizations to measure. Studies that attempt to address this question have done so by viewing social media platform analytics (e.g., Facebook analytics) or analyzing written content by users of social media (Nah and Saxton, 2013; Auger, 2013; Uzunoğlu and Misci Kip, 2014; or Guo and Saxton, 2014). The added value of this study is to present a model for practice (Weil, 1997) that explores social media use and its challenges from a non-profit organization’s mission through its desired outcome, in this case an outcome of advocacy and civic engagement.

We focus on one non-profit organization, Blue Star Families, that actively engages in advocacy and civic engagement. Blue Star Families was formed in 2009 to “raise the awareness of the challenges of military family life with our civilian communities and leaders” (Blue Star Families, 2010). Blue Star Families is a virtual organization with no physical office location. Thus, the organization relies on its Web presence and social media tools to advocate for military families and engage service members and their families, communities, and citizens in civic engagement activities (Blue Star Families, 2010).

The study aims to provide organizational-level insights of the successes and challenges of working in the social media environment. Specifically, the study asks: What are the processes non-profit organizations follow to link organizational mission to outcomes when using social media platforms? What are the successes and challenges of using social media platforms for advocacy and civic engagement purposes? In our effort to answer these questions, we present a new model to explore non-profit organizations’ use of social media platforms by building on previous models and frameworks developed to explore the use of social media in the public, private, and non-profit sectors.

This research is important for three reasons. First, most previous studies of social media tend to employ models that focus on the satisfaction of the social media tools for organizational members, rather than the utility of social media as a tool to meet organizational goals. Our research offers a means to explore the utility of social media from an organization perspective. Second, the exemplar case for our research, Blue Star Families, Inc., is a non-profit organization whose mission is to create and nurture a virtual community spread over a large geographical — if not global — area. Because Blue Star Families was founded as an online organization that could not exist without social media, it provides a case for which social media is a critical component of the organization’s activity. Finally, we offer some “lessons learned” from our case to identify issues for other organizations seeking to create a significant social media presence.

This paper is organized as follows: first, the growth of social media is briefly addressed to provide background context. Second, previous models and frameworks exploring social media are discussed. This is followed by a presentation of a new model exploring the use of social media from an organizational perspective, starting with the driver of a non-profit organization’s mission, to its desired outcomes of advocacy and civic engagement. Third, the case study methodology is explained. Next, we present an analysis and discussion applying the new model to Blue Star Families’ use of social media platforms. We conclude by discussing the challenges of social media revealed in the case study analysis, and we offer recommendations to address these challenges….(More)”

How the USGS uses Twitter data to track earthquakes


Twitter Blog: “After the disastrous Sichuan earthquake in 2008, people turned to Twitter to share firsthand information about the earthquake. What amazed many was the impression that Twitter was faster at reporting the earthquake than the U.S. Geological Survey (USGS), the official government organization in charge of tracking such events.

This Twitter activity wasn’t a big surprise to the USGS. The USGS National Earthquake Information Center (NEIC) processes data from about 2,000 real-time earthquake sensors, the majority based in the United States. That leaves a lot of the world with no sensor coverage. On the other hand, there are hundreds of millions of people using Twitter who can report earthquakes. At first, the USGS staff was a bit skeptical that Twitter could be used as a detection system for earthquakes – but when they looked into it, they were surprised at the effectiveness of Twitter data for detection.

USGS staffers Paul Earle, a seismologist, and Michelle Guy, a software developer, teamed up to look at how Twitter data could be used for earthquake detection and verification. Using Twitter’s Public API, they applied the same time-series event-detection method they use when detecting earthquakes. This gave them a baseline for earthquake-related chatter, but they decided to dig in even further. They found that people Tweeting about actual earthquakes kept their Tweets really short, even just to ask, “earthquake?” Concluding that people who are experiencing earthquakes aren’t very chatty, they started filtering out Tweets with more than seven words. They also recognized that people sharing links or the size of the earthquake were significantly less likely to be offering firsthand reports, so they filtered out any Tweets sharing a link or a number. Ultimately, this filtered stream proved very effective at determining when earthquakes occurred globally.
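The filtering heuristic described above is simple enough to sketch in a few lines. This is an illustrative reconstruction from the description in the article, not the USGS’s actual code; the sample tweets are invented.

```python
import re

def is_candidate_report(tweet_text):
    """Keep only tweets that look like firsthand earthquake reports:
    short (seven words or fewer), with no links and no numbers."""
    if "http" in tweet_text.lower():
        return False  # links suggest secondhand news, not firsthand reports
    if re.search(r"\d", tweet_text):
        return False  # numbers suggest someone relaying a magnitude
    if len(tweet_text.split()) > 7:
        return False  # people experiencing a quake aren't very chatty
    return True

tweets = [
    "earthquake?",
    "Magnitude 6.1 quake hits Chile http://example.com",
    "whoa everything is shaking",
]
print([t for t in tweets if is_candidate_report(t)])
# → ['earthquake?', 'whoa everything is shaking']
```

The point of the filter is precision over recall: it discards most earthquake-related chatter and keeps only the terse, firsthand-sounding messages that carry the detection signal.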

USGS Modeling Twitter Data to Detect Earthquakes

While I was at the USGS office in Golden, Colo. interviewing Michelle and Paul, three earthquakes happened in a relatively short time. Using Twitter data, their system was able to pick up on an aftershock in Chile within one minute and 20 seconds – and it only took 14 Tweets from the filtered stream to trigger an email alert. The other two earthquakes, off Easter Island and Indonesia, weren’t picked up because they were not widely felt…..
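The alerting step described above — 14 Tweets from the filtered stream arriving close together triggered an email — amounts to a count-over-sliding-window threshold. The sketch below illustrates that idea; the window length and threshold are assumptions for the example, not the USGS’s published parameters.

```python
from collections import deque

class TweetRateAlarm:
    """Fire when at least `threshold` filtered tweets arrive within `window_s` seconds."""
    def __init__(self, threshold=14, window_s=80):
        self.threshold = threshold
        self.window_s = window_s
        self.times = deque()

    def observe(self, t):
        # t: tweet timestamp in seconds; returns True when the alarm fires.
        self.times.append(t)
        # Drop timestamps that have fallen out of the sliding window.
        while self.times and t - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) >= self.threshold

alarm = TweetRateAlarm()
# 13 tweets five seconds apart: no alarm yet; the 14th fires it.
fired = [alarm.observe(i * 5) for i in range(14)]
print(fired[-2], fired[-1])  # → False True
```

Because background chatter rarely produces a burst of short, link-free tweets in one minute, a fixed count like this works as a cheap global detector.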

The USGS monitors for earthquakes in many languages, and the words used can be a clue as to the magnitude and location of the earthquake. Chile has two words for earthquakes: terremoto and temblor; terremoto is used to indicate a bigger quake. This one in Chile started with people asking if it was a terremoto, but others realized that it was a temblor.

As the USGS team notes, Twitter data augments their own detection work on felt earthquakes. If they’re getting reports of an earthquake in a populated area but no Tweets from there, that’s a good indicator to them that it’s a false alarm. It’s also very cost effective for the USGS, because they use Twitter’s Public API and open-source software such as Kibana and ElasticSearch to help determine when earthquakes occur….(More)”

The Quantified Community and Neighborhood Labs: A Framework for Computational Urban Planning and Civic Technology Innovation


Constantine E. Kontokosta: “This paper presents the conceptual framework and justification for a “Quantified Community” (QC) and a networked experimental environment of neighborhood labs. The QC is a fully instrumented urban neighborhood that uses an integrated, expandable, and participatory sensor network to support the measurement, integration, and analysis of neighborhood conditions, social interactions and behavior, and sustainability metrics to support public decision-making. Through a diverse range of sensor and automation technologies — combined with existing data generated through administrative records, surveys, social media, and mobile sensors — information on human, physical, and environmental elements can be processed in real-time to better understand the interaction and effects of the built environment on human well-being and outcomes. The goal is to create an “informatics overlay” that can be incorporated into future urban development and planning that supports the benchmarking and evaluation of neighborhood conditions, provides a test-bed for measuring the impact of new technologies and policies, and responds to the changing needs and preferences of the local community….(More)”

French digital rights bill published in ‘open democracy’ first


France24: “A proposed law on the Internet and digital rights in France has been opened to public consultation before it is debated in parliament in an “unprecedented” exercise in “open democracy”.

The text of the “Digital Republic” bill was published online on Saturday and is open to suggestions for amendments by French citizens until October 17.

It can be found on the “Digital Republic” web page, and is even available in English.

“We are opening a new page in the history of our democracy,” Prime Minister Manuel Valls said at a press conference as the consultation was launched. “This is the first time in France, or indeed in any European country, that a proposed law has been opened to citizens in this way.”

“And it won’t be the last time,” he said, adding that the move was an attempt to redress a “growing distrust of politics”.

Participants will be able to give their opinions and make suggestions for changes to the text of the bill.

Suggestions that get the highest number of public votes will be guaranteed an official response before the bill is presented to parliament.

Freedoms and fairness

In its original and unedited form, the text of the bill pushes heavily towards online freedoms as well as improving the transparency of government.

An “Open Data” policy would make official documents and public sector research available online, while a “Net Neutrality” clause would prevent Internet services such as Netflix or YouTube from paying for faster connection speeds at the expense of everyone else.

For personal freedoms, the law would allow citizens the right to recover emails, files and other data such as pictures stored on “cloud” services….(More)”