Why This Company Is Crowdsourcing, Gamifying The World's Most Difficult Problems


FastCompany: “The biggest consultancy firms–the McKinseys and Janeses of the world–make many millions of dollars predicting the future and writing what-if reports for clients. This model is built on the idea that those companies know best–and that information and ideas should be handed down from on high.
But one consulting house, Wikistrat, is upending the model: Instead of using a stable of in-house analysts, the company crowdsources content and pays the crowd for its time. Wikistrat’s hundreds of analysts–primarily consultants, academics, journalists, and retired military personnel–are compensated for participating in what they call “crowdsourced simulations.” In other words, make money for brainstorming.

According to Joel Zamel, Wikistrat’s founder, approximately 850 experts in various fields rotate in and out of different simulations and project exercises for the company. While participating in a crowdsourced simulation, consultants are paid a flat fee plus performance bonuses based on a gamification engine where experts compete to win extra cash. The company declined to reveal its fee scale, but as of 2011 bonus money appears to be in the $10,000 range.
Zamel characterizes the company’s clients as a mix of government agencies worldwide and multinational corporations. The simulations are semi-anonymous for players; consultants don’t know who their paper is being written for or who the end consumer is, but clients know which of Wikistrat’s contestants are participating in the brainstorm exercise. Once an exercise is over, the discussions from the exercise are taken by full-time employees at Wikistrat and converted into proper reports for clients.
“We’ve developed a quite significant crowd network and a lot of functionality into the platform,” Zamel tells Fast Company. “It uses a gamification engine we created that incentivizes analysts by ranking them at different levels for the work they do on the platform. They are immediately rewarded through the engine, and we also track granular changes made in real time. This allows us to track analyst activity and encourages them to put time and energy into Wiki analysis.” Zamel says projects typically run between three and four weeks, with between 50 and 100 analysts working on a project for generally between five and 12 hours per week. Most of the analysts, he says, view this as a side income on top of their regular work at day jobs but some do much more: Zamel cited one PhD candidate in Australia working 70 hours a week on one project instead of 10 to 15 hours.
Much of Wikistrat’s output is related to current events. Although Zamel says the bulk of their reports are written for clients and not available for public consumption, Wikistrat does run frequent public simulations as a way of attracting publicity and recruiting talent for the organization. Their most recent crowdsourced project is called Myanmar Moving Forward and runs from November 25 to December 9. According to Wikistrat, they are asking their “Strategic community to map out Myanmar’s current political risk factor and possible futures (positive, negative, or mixed) for the new democracy in 2015. The simulation is designed to explore the current social, political, economic, and geopolitical threats to stability–i.e. its political risk–and to determine where the country is heading in terms of its social, political, economic, and geopolitical future.”…
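
The gamification mechanics Zamel describes above (ranking analysts by level, rewarding contributions immediately, and tracking granular changes in real time) can be pictured with a minimal sketch in Python. The contribution types, point values, and rank thresholds below are hypothetical illustrations, not Wikistrat's actual scoring rules.

```python
from collections import defaultdict

# Hypothetical point values per contribution type (illustrative only,
# not Wikistrat's actual scoring rules).
POINTS = {"new_scenario": 20, "comment": 5, "edit": 2}

# Hypothetical cumulative-point thresholds for analyst ranks.
LEVELS = [(0, "Contributor"), (100, "Analyst"), (250, "Senior Analyst")]


class GamificationEngine:
    """Toy engine that scores analyst activity and ranks contributors."""

    def __init__(self):
        self.scores = defaultdict(int)
        self.activity_log = []  # every change is logged, enabling real-time tracking

    def record(self, analyst, action):
        """Log one contribution and return the points awarded (immediate feedback)."""
        points = POINTS.get(action, 0)
        self.scores[analyst] += points
        self.activity_log.append((analyst, action, points))
        return points

    def level(self, analyst):
        """Return the highest rank whose threshold the analyst has reached."""
        rank = LEVELS[0][1]
        for threshold, name in LEVELS:
            if self.scores[analyst] >= threshold:
                rank = name
        return rank

    def leaderboard(self):
        """Analysts sorted by total points, highest first."""
        return sorted(self.scores.items(), key=lambda item: item[1], reverse=True)


engine = GamificationEngine()
engine.record("analyst_a", "new_scenario")
engine.record("analyst_a", "comment")
engine.record("analyst_b", "edit")
print(engine.leaderboard())       # [('analyst_a', 25), ('analyst_b', 2)]
print(engine.level("analyst_a"))  # 'Contributor' (still below the 100-point rank)
```

A production system would presumably add bonus payouts tied to leaderboard position and per-simulation scoping, but the core loop of log, score, and rank is the same.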

Selected Readings on Crowdsourcing Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing data was originally published in 2013.

As institutions seek to improve decision-making through data and put public data to use to improve the lives of citizens, new tools and projects are allowing citizens to play a role in both the collection and utilization of data. Participatory sensing and other citizen data collection initiatives, notably in the realm of disaster response, are allowing citizens to crowdsource important data, often using smartphones, that would be either impossible or burdensomely time-consuming for institutions to collect themselves. Civic hacking, often performed in hackathon events, on the other hand, is a growing trend in which governments encourage citizens to transform data from government and other sources into useful tools to benefit the public good.

Annotated Selected Reading List (in alphabetical order)

Baraniuk, Chris. “Power Politechs.” New Scientist 218, no. 2923 (June 29, 2013): 36–39. http://bit.ly/167ul3J.

  • In this article, Baraniuk discusses civic hackers, “an army of volunteer coders who are challenging preconceptions about hacking and changing the way your government operates. In a time of plummeting budgets and efficiency drives, those in power have realised they needn’t always rely on slow-moving, expensive outsourcing and development to improve public services. Instead, they can consider running a hackathon, at which tech-savvy members of the public come together to create apps and other digital tools that promise to enhance the provision of healthcare, schools or policing.”
  • While recognizing that “civic hacking has established a pedigree that demonstrates its potential for positive impact,” Baraniuk argues that a “more rigorous debate over how this activity should evolve, or how authorities ought to engage in it” is needed.

Barnett, Brandon, Muki Hansteen Izora, and Jose Sia. “Civic Hackathon Challenges Design Principles: Making Data Relevant and Useful for Individuals and Communities.” Hack for Change, https://bit.ly/2Ge6z09.

  • In this paper, researchers from Intel Labs offer “guiding principles to support the efforts of local civic hackathon organizers and participants as they seek to design actionable challenges and build useful solutions that will positively benefit their communities.”
  • The authors’ proposed design principles are:
    • Focus on the specific needs and concerns of people or institutions in the local community. Solve their problems and challenges by combining different kinds of data.
    • Seek out data far and wide (local, municipal, state, institutional, non-profits, companies) that is relevant to the concern or problem you are trying to solve.
    • Keep it simple! This can’t be overstated. Focus [on] making data easily understood and useful to those who will use your application or service.
    • Enable users to collaborate and form new communities and alliances around data.

Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” Perspectives on Psychological Science 6, no. 1 (January 1, 2011): 3–5. http://bit.ly/H56lER.

  • This article examines the capability of Amazon’s Mechanical Turk to act as a source of data for researchers, in addition to its traditional role as a microtasking platform.
  • The authors examine the demographics of MTurkers and find that “(a) MTurk participants are slightly more demographically diverse than are standard Internet samples and are significantly more diverse than typical American college samples; (b) participation is affected by compensation rate and task length, but participants can still be recruited rapidly and inexpensively; (c) realistic compensation rates do not affect data quality; and (d) the data obtained are at least as reliable as those obtained via traditional methods.”
  • The paper concludes that, just as MTurk can be a strong tool for crowdsourcing tasks, data derived from MTurk can be high quality while also being inexpensive and obtained rapidly.

Goodchild, Michael F., and J. Alan Glennon. “Crowdsourcing Geographic Information for Disaster Response: a Research Frontier.” International Journal of Digital Earth 3, no. 3 (2010): 231–241. http://bit.ly/17MBFPs.

  • This article examines issues of data quality in the face of the new phenomenon of geographic information being generated by citizens, in order to examine whether this data can play a role in emergency management.
  • The authors argue that “[d]ata quality is a major concern, since volunteered information is asserted and carries none of the assurances that lead to trust in officially created data.”
  • Due to the fact that time is crucial during emergencies, the authors argue that, “the risks associated with volunteered information are often outweighed by the benefits of its use.”
  • The paper examines four wildfires in Santa Barbara in 2007-2009 to discuss current challenges with volunteered geographical data, and concludes that further research is required to answer how volunteer citizens can be used to provide effective assistance to emergency managers and responders.

Hudson-Smith, Andrew, Michael Batty, Andrew Crooks, and Richard Milton. “Mapping for the Masses: Accessing Web 2.0 Through Crowdsourcing.” Social Science Computer Review 27, no. 4 (November 1, 2009): 524–538. http://bit.ly/1c1eFQb.

  • This article describes the way in which “we are harnessing the power of web 2.0 technologies to create new approaches to collecting, mapping, and sharing geocoded data.”
  • The authors examine GMapCreator and MapTube, which allow users to do a range of map-related functions such as create new maps, archive existing maps, and share or produce bottom-up maps through crowdsourcing.
  • They conclude that “these tools are helping to define a neogeography that is essentially ‘mapping for the masses,’ while noting that there are many issues of quality, accuracy, copyright, and trust that will influence the impact of these tools on map-based communication.”

Kanhere, Salil S. “Participatory Sensing: Crowdsourcing Data from Mobile Smartphones in Urban Spaces.” In Distributed Computing and Internet Technology, edited by Chittaranjan Hota and Pradip K. Srimani, 19–26. Lecture Notes in Computer Science 7753. Berlin, Heidelberg: Springer, 2013. https://bit.ly/2zX8Szj.

  • This paper provides a comprehensive overview of participatory sensing — a “new paradigm for monitoring the urban landscape” in which “ordinary citizens can collect multi-modal data streams from the surrounding environment using their mobile devices and share the same using existing communications infrastructure.”
  • In addition to examining a number of innovative applications of participatory sensing, Kanhere outlines the following key research challenges:
    • Dealing with incomplete samples
    • Inferring user context
    • Protecting user privacy
    • Evaluating data trustworthiness
    • Conserving energy

Public Open Sensor Data: Revolutionizing Smart Cities


New Paper in Technology and Society Magazine, IEEE (Volume: 32,  Issue: 4): “Local governments have decided to take advantage of the presence of wireless sensor networks (WSNs) in their cities to efficiently manage several applications in their daily responsibilities. The enormous amount of information collected by sensor devices allows the automation of several real-time services to improve city management by using intelligent traffic-light patterns during rush hour, reducing water consumption in parks, or efficiently routing garbage collection trucks throughout the city [1]. The sensor information required by these examples is mostly self-consumed by city-designed applications and managers.”

Using Cognitive-Behavioural Therapy to reduce crime


Paper of the week selected by the British Nudge Unit: “The paper we would like to highlight this week describes a large-scale randomised controlled trial aimed at improving life outcomes for disadvantaged youth.

Preventing youth violence and dropout: a randomized field experiment (April 25, 2013)
Sara B. Heller, Harold A. Pollack, Roseanna Ander and Jens Ludwig
The paper presents promising results from a trial in which teenagers were offered after-school programming in combination with Cognitive Behavioural Therapy. Cognitive Behavioural Therapy helps people reconsider biased beliefs they have about the world. In the domain of crime this includes learning to avoid overestimating the extent to which others want to hurt or harm someone else (i.e. hostile attribution bias).
The trial was run in 18 public schools in some of Chicago’s most disadvantaged neighbourhoods. 2,740 young males were either offered the programme (treatment group) or not (control group). Around half of those in the treatment group actually participated – on average for 13 sessions over the course of a year. Offering them the programme reduced the rate at which they were arrested for violent crime by 3.3 percentage points, from 16.7 percent in the control group. This is a fall of 20%.”
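
As a quick check on the headline numbers quoted above, the "fall of 20%" is simply the 3.3 percentage-point reduction expressed relative to the control group's 16.7 percent arrest rate; a minimal calculation:

```python
control_rate = 16.7  # violent-crime arrest rate in the control group (percent)
reduction_pp = 3.3   # absolute reduction for the group offered the programme (percentage points)

treatment_rate = control_rate - reduction_pp       # 13.4 percent
relative_fall = reduction_pp / control_rate * 100  # about 19.8 percent

print(f"treatment-group arrest rate: {treatment_rate:.1f}%")
print(f"relative fall: {relative_fall:.0f}%")      # rounds to the reported 20%
```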

David Cameron's Transparency Revolution? The Impact of Open Data in the UK


Paper by Ben Worthy: “This article examines the impact of the UK Government’s Transparency agenda, focusing on the publication of spending data at local government level. It measures the democratic impact in terms of creating transparency and accountability, public participation and everyday information. The study uses a survey of local authorities, interviews and FOI requests to build a picture of use and impact.
It argues that the spending data has led to some accountability, though from those already monitoring government rather than from citizens. It has not led to increased participation, as it lacks the narrative or the accountability instruments needed to make full use of it. Nor has it created a new stream of information to underpin citizen choice, though new innovations offer this possibility.
The evidence points to third party innovations as the key. They can contextualise and ‘localise’ information and may also provide the comparisons that are a first step towards more effective accountability.
The superficially simple and neutral reforms conceal complex political dynamics. The very design lends itself to certain framing effects, further compounded by assumptions, blurred concepts, and a lack of accountability instruments to act on problems raised by the data.”

Selected Readings on Smart Disclosure


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of smart disclosure was originally published in 2013.

While much attention is paid to open data, data transparency need not be managed by a simple On/Off switch: It’s often desirable to make specific data available to the public or individuals in targeted ways. A prime example is the use of government data in Smart Disclosure, which provides consumers with data they need to make difficult marketplace choices in health care, financial services, and other important areas. Governments collect two kinds of data that can be used for Smart Disclosure: First, governments collect information on services of high interest to consumers, and are increasingly releasing this kind of data to the public. In the United States, for example, the Department of Health and Human Services collects and releases online data on health insurance options, while the Department of Education helps consumers understand the true cost (after financial aid) of different colleges. Second, state, local, or national governments hold information on consumers themselves that can be useful to them. In the U.S., for example, the Blue Button program was launched to help veterans easily access their own medical records.

Annotated Selected Reading List (in alphabetical order)

Better Choices: Better Deals Report on Progress in the Consumer Empowerment Strategy. Progress Report. Consumer Empowerment Strategy. United Kingdom: Department for Business Innovation & Skills, December 2012. http://bit.ly/17MqnL3.

  • The report details the progress made through the United Kingdom’s consumer empowerment strategy, Better Choices: Better Deals. The plan seeks to mitigate knowledge imbalances through information disclosure programs and targeted nudges.
  • The empowerment strategy’s four sections demonstrate the potential benefits of Smart Disclosure: 1. The power of information; 2. The power of the crowd; 3. Helping the vulnerable; and 4. A new approach to Government working with business.

Braunstein, Mark L. “Empowering the Patient.” In Health Informatics in the Cloud, 67–79. Springer Briefs in Computer Science. New York: Springer, 2013. https://bit.ly/2UB4jTU.

  • This book discusses the application of computing to healthcare delivery, public health and community based clinical research.
  • Braunstein asks and seeks to answer critical questions such as: Who should make the case for smart disclosure when the needs of consumers are not being met? What role do non-profits play in the conversation on smart disclosure especially when existing systems (or lack thereof) of information provision do not work or are unsafe?

Brodi, Elisa. “Product-Attribute Information” and “Product-Use Information”: Smart Disclosure and New Policy Implications for Consumers’ Protection. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network, September 4, 2012. http://bit.ly/17hssEK.

  • This paper from the Research Area of the Bank of Italy’s Law and Economics Department “surveys the literature on product use information and analyzes whether and to what extent [the] Italian regulator is trying to ensure consumers’ awareness as to their use pattern.” Rather than focusing on the type of information governments can release to citizens, Brodi proposes that governments require private companies to provide valuable use-pattern information to citizens to inform decision-making.
  • The form of regulation proposed by Brodi and other proponents “is based on a basic concept: consumers can be protected if companies are forced to disclose data on the customers’ consumption history through electronic files.”

National Science and Technology Council. Smart Disclosure and Consumer Decision Making: Report of the Task Force on Smart Disclosure. Task Force on Smart Disclosure: Information and Efficiency in Consumer Markets. Washington, DC: Executive Office of the President, May 30, 2013. http://1.usa.gov/1aamyoT.

  • This inter-agency report is a comprehensive description of smart disclosure approaches being used across the Federal Government. The report highlights the importance of making data available not only to consumers but also to innovators who can build better options for consumers.
  • In addition to providing context about government policies that guide smart disclosure initiatives, the report raises questions about what parties have influence in this space.

“Policies in Practice: The Download Capability.” Markle Connecting for Health Work Group on Consumer Engagement, August 2010. http://bit.ly/HhMJyc.

  • This report from the Markle Connecting for Health Work Group on Consumer Engagement — the creator of the Blue Button system for downloading personal health records — features a “set of privacy and security practices to help people download their electronic health records.”
  • To help make health information easily accessible for all citizens, the report lists a number of important steps:
    • Make the download capability a common practice
    • Implement sound policies and practices to protect individuals and their information
    • Collaborate on sample data sets
    • Support the download capability as part of Meaningful Use and qualified or certified health IT
    • Include the download capability in procurement requirements.
  • The report also describes the rationale for the development of the Blue Button — perhaps the best known example of Smart Disclosure currently in existence — and the targeted release of health information in general:
    • Individual access to information is rooted in fair information principles and law
    • Patients need and want the information
    • The download capability would encourage innovation
    • A download capability frees data sources from having to make many decisions about the user interface
    • A download capability would hasten the path to standards and interoperability.

Sayogo, Djoko Sigit, and Theresa A. Pardo. “Understanding Smart Data Disclosure Policy Success: The Case of Green Button.” In Proceedings of the 14th Annual International Conference on Digital Government Research, 72–81. New York: ACM, 2013. http://bit.ly/1aanf1A.

  • This paper from the Proceedings of the 14th Annual International Conference on Digital Government Research explores the implementation of the Green Button Initiative, analyzing qualitative data from interviews with experts involved in Green Button development and implementation.
  • Moving beyond the specifics of the Green Button initiative, the authors raise questions on the motivations and success factors facilitating successful collaboration between public and private organizations to support smart disclosure policy.

Thaler, Richard H., and Will Tucker. “Smarter Information, Smarter Consumers.” Harvard Business Review, January–February 2013 (The Big Idea). http://bit.ly/18gimxw.

  • In this article, Thaler and Tucker make three key observations regarding the challenges related to smart disclosure:
    • “We are constantly confronted with information that is highly important but extremely hard to navigate or understand.”
    • “Repeated attempts to improve disclosure, including efforts to translate complex contracts into “plain English,” have met with only modest success.”
    • “There is a fundamental difficulty of explaining anything complex in simple terms. Most people find it difficult to write instructions explaining how to tie a pair of shoelaces.”

E-Government and Its Limitations: Assessing the True Demand Curve for Citizen Public Participation


Paper by David Karpf: “Many e-government initiatives start with promise, but end up either as digital “ghost towns” or as a venue exploited by organized interests. The problem with these initiatives is rooted in a set of common misunderstandings about the structure of citizen interest in public participation – simply put, the Internet does not create public interest, it reveals public interest. Public interest can be high or low, and governmental initiatives can be polarized or non-polarized. The paper discusses two common pitfalls (“the Field of Dreams Fallacy” and “Blessed are the Organized”) that demand alternate design choices and modified expectations. By treating public interest and public polarization as variables, the paper develops a typology of appropriate e-government initiatives that can help identify the boundary conditions for transformative digital engagement.”

DataViva: a Big Data Engine for the Brazilian Economy


Piece by André Victor dos Santos Barrence and Cesar A. Hidalgo: “The current Internet paradigm, in which one can search about anything and retrieve information, is absolutely empowering. We can browse files, websites and indexes and effortlessly reach a good amount of information. Google, for instance, was explicitly built on a library analogy available to everyone. However, it is a world where information that should be easily accessible is still hidden in unfriendly databases, and where the best-case scenario is finding a few snippets of information embedded within the paragraphs of a report. But is this the way it should be? Or is this just the world we are presently stuck with?
The last decade has been particularly marked by an increasing hype on big data and analytics, mainly fueled by those who are interested in writing narratives on the topic but not necessarily coding about it, even when data itself is not the problem.
Let’s take the case of governments. Governments have plenty of data and in many cases it is actually public (at least in principle). Governments “know” how many people work in every occupation, in every industry and in every location; they know their salaries, previous employers and education history. From a pure data perspective all that is embedded in tax, social security records or annual registrations. From a more pragmatic perspective, it is still inaccessible and hidden even when it is legally open to the public. We live in a world where the data is there, but where the statistics and information are not.
The state government of Minas Gerais in Brazil (the country’s third-largest economy, with a territory larger than France and 20 million inhabitants) made an important step in that direction by releasing DataViva.info, a platform that opens data on exports and occupations for the entire formal sector of the Brazilian economy through more than 700 million interactive visualizations. Instead of poorly designed tables and interfaces, it guides users to answer questions or freely discover locations, industries and occupations in Brazil that are of interest to them. DataViva allows users to explore simple questions such as the evolution of exports over the last decade for each of the 5,567 municipalities in the country, or highly specific queries, for instance, the average salaries paid to computer scientists working in the software development industry in Belo Horizonte, the state capital of Minas.
DataViva’s visualizations are built on the idea that the industrial and economic development of locations is highly path dependent. This means that locations are more likely to be successful at developing industries and activities that are related to the ones already existing, since those indicate the existence of labor inputs, and other capabilities, that are specific and that can often be redeployed to a few related industries and activities. Thus, it informs the processes by which opportunities can be explored and the prospective pathways for greater prosperity.
The idea that information is key for the functioning of economies is at least as old as Friedrich Hayek’s seminal paper The Use of Knowledge in Society from 1945. According to Hayek, prices help coordinate economic activities by providing information about the wants and needs for goods and services. Yet price information can only serve as a signal as long as people know those prices. Maybe the salaries for engineers in the municipality of Betim (Minas Gerais) are excellent and indicate a strong need for them? But who would have known how many engineers there are in Betim and what their average salaries are?
But the remaining question is: why is Minas Gerais making all of this public data easily available? More than resorting to the contemporary argument of open government, Minas understands that this is extremely valuable information for investors searching for business opportunities, entrepreneurs pursuing new ventures or workers looking for better career prospects. Lastly, the ultimate goal of DataViva is to provide a common ground for open discussions, moving away from the information-deprived idea of central planning and towards a future where collaborative planning might become the norm. It is a highly creative attempt to renew public governance for the 21st century.
Despite being a relatively unknown state outside of Brazil, by releasing a platform such as DataViva Minas is providing a strong signal about where in the world governments are really pushing innovation forward, rather than simply admiring and copying solutions that used to come from trendsetters in the developed world. It seems that real innovation isn’t necessarily taking place in Washington, Paris or London anymore.”
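
The queries described above (average salaries for computer scientists in Belo Horizonte's software industry, or how many engineers work in Betim and at what pay) boil down to filter-and-aggregate operations over formal-sector employment records. The sketch below illustrates that kind of query with pandas on a made-up table; the column names and salary figures are invented for illustration and do not reflect DataViva's actual schema, API, or data.

```python
import pandas as pd

# Hypothetical microdata in the spirit of the formal-sector records DataViva draws on.
# Column names and values are illustrative only.
records = pd.DataFrame([
    {"municipality": "Belo Horizonte", "industry": "software development",
     "occupation": "computer scientist", "monthly_salary": 7200.0},
    {"municipality": "Belo Horizonte", "industry": "software development",
     "occupation": "computer scientist", "monthly_salary": 6500.0},
    {"municipality": "Betim", "industry": "automotive",
     "occupation": "engineer", "monthly_salary": 9100.0},
])

# "Average salary of computer scientists in software development in Belo Horizonte"
mask = (
    (records["municipality"] == "Belo Horizonte")
    & (records["industry"] == "software development")
    & (records["occupation"] == "computer scientist")
)
print(records.loc[mask, "monthly_salary"].mean())  # 6850.0

# "How many engineers are there in Betim, and what is their average salary?"
betim_engineers = records[(records["municipality"] == "Betim")
                          & (records["occupation"] == "engineer")]
print(len(betim_engineers), betim_engineers["monthly_salary"].mean())  # 1 9100.0
```

DataViva's contribution is precisely that such questions no longer require assembling the microdata oneself: the aggregations are precomputed and exposed through interactive visualizations.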
 

New business models for open data in the digital economy: a preliminary assessment of the literature


Conference paper by Carla M. Bonina and Silvia Elaluf-Calderwood: “There is increasing excitement about the potential economic and social benefits of using newly released data in open access format (open data). For the government, open data offers potential for improving public service delivery, transparency and efficiency of operations. Open data also bring promising opportunities to generate innovation and economic growth in the economy. By releasing open data, individuals and companies will build new products and services that can feed back into the economy and promote economic growth. Despite recent advances on the matter, the business models that may help extract the potential value of open data are not well understood. In this research in progress, we review possible directions in the literature to address in what ways open data may be a source of new business and innovation, as well as what challenges and potential barriers emerge in its take-up.”

Owning the city: New media and citizen engagement in urban design


Paper by Michiel de Lange and Martijn de Waal in First Monday: “In today’s cities our everyday lives are shaped by digital media technologies such as smart cards, surveillance cameras, quasi–intelligent systems, smartphones, social media, location–based services, wireless networks, and so on. These technologies are inextricably bound up with the city’s material form, social patterns, and mental experiences. As a consequence, the city has become a hybrid of the physical and the digital. This is perhaps most evident in the global north, although in emerging countries like Indonesia and China, mobile phones, wireless networks and CCTV cameras have also become a dominant feature of urban life (Castells, et al., 2004; Qiu, 2007, 2009; de Lange, 2010). What does this mean for urban life and culture? And what are the implications for urban design, a discipline that has hitherto largely been concerned with the city’s built form?
In this contribution we do three things. First we take a closer look at the notion of ‘smart cities’ often invoked in policy and design discourses about the role of new media in the city. In this vision, the city is mainly understood as a series of infrastructures that must be managed as efficiently as possible. However, critics note that these technological imaginaries of a personalized, efficient and friction–free urbanism ignore some of the basic tenets of what it means to live in cities (Crang and Graham, 2007).
Second, we want to fertilize the debates and controversies about smart cities by forwarding the notion of ‘ownership’ as a lens to zoom in on what we believe is the key question largely ignored in smart city visions: how to engage and empower citizens to act on complex collective urban problems? As is explained in more detail below, we use ‘ownership’ not to refer to an exclusive proprietorship but to an inclusive form of engagement, responsibility and stewardship. At stake is the issue how digital technologies shape the ways in which people in cities manage coexistence with strangers who are different and who often have conflicting interests, and at the same time form new collectives or publics around shared issues of concern (see, for instance, Jacobs, 1992; Graham and Marvin, 2001; Latour, 2005). ‘Ownership’ teases out a number of shifts that take place in the urban public domain characterized by tensions between individuals and collectives, between differences and similarities, and between conflict and collaboration.
Third, we discuss a number of ways in which the rise of urban media technologies affects the city’s built form. Much has been said and written about changing spatial patterns and social behaviors in the media city. Yet as the editors of this special issue note, less attention has been paid to the question of how urban new media shape the built form. The notion of ownership allows us to figure the connection between technology and the city as more intricate than direct links of causality or correlation. Therefore, ownership in our view provides a starting point for urban design professionals and citizens to reconsider their own role in city making.
Questions about the role of digital media technologies in shaping the social fabric and built form of urban life are all the more urgent in the context of challenges posed by rapid urbanization, a worldwide financial crisis that hits particularly hard on the architectural sector, socio–cultural shifts in the relationship between professional and amateur, the status of expert knowledge, societies that face increasingly complex ‘wicked’ problems, and governments retreating from public services. When grounds are shifting, urban design professionals as well as citizens need to reconsider their own role in city making.”