Selected Readings on Data and Humanitarian Response


By Prianka Srinivasan and Stefaan G. Verhulst *

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data and humanitarian response was originally published in 2016.

Data, when used well and in a trusted manner, allows humanitarian organizations to innovate how they respond to emergency events, including better coordination of post-disaster relief efforts, the ability to harness local knowledge to create more targeted relief strategies, and tools to predict and monitor disasters in real time. Consequently, in recent years both multinational groups and community-based advocates have begun to integrate data collection and evaluation strategies into their humanitarian operations in order to respond to emergencies better and more quickly. However, this movement also poses a number of challenges. Compared to the private sector, humanitarian organizations are often less equipped to successfully analyze and manage big data, which poses a number of risks to the security of victims’ data. Furthermore, the complex power dynamics that exist within humanitarian spaces may be exacerbated by the introduction of new technologies and big data collection mechanisms. Below we share:

  • Selected Reading List (summaries and hyperlinks)
  • Annotated Selected Reading List
  • Additional Readings

Selected Reading List (summaries in alphabetical order)

Data and Humanitarian Response

Risks of Using Big Data in Humanitarian Context

Annotated Selected Reading List (in alphabetical order)

Karlsrud, John. “Peacekeeping 4.0: Harnessing the Potential of Big Data, Social Media, and Cyber Technologies.” Cyberspace and International Relations, 2013. http://bit.ly/235Qb3e

  • This chapter from the book “Cyberspace and International Relations” suggests that advances in big data give humanitarian organizations unprecedented opportunities to prevent and mitigate natural disasters and humanitarian crises. However, the sheer amount of unstructured data necessitates effective “data mining” strategies if multinational organizations are to make the most of this data.
  • By profiling civil-society organizations that use big data in their peacekeeping efforts, Karlsrud suggests that these community-focused initiatives are leading the movement toward analyzing and using big data in countries vulnerable to crisis.
  • The chapter concludes by offering ten recommendations to UN peacekeeping forces to best realize the potential of big data and new technology in supporting their operations.

Mancini, Francesco. “New Technology and the Prevention of Violence and Conflict.” International Peace Institute, 2013. http://bit.ly/1ltLfNV

  • This report from the International Peace Institute looks at five case studies to assess how information and communications technologies (ICTs) can help prevent humanitarian conflicts and violence. The findings suggest that context has a significant impact on the effectiveness of ICTs for conflict prevention, and that any strategy must take into account the specific contingencies of the region to be successful.
  • The report suggests seven lessons gleaned from the five case studies, including:
    • New technologies are just one of a variety of tools to combat violence. Consequently, organizations must investigate a variety of complementary strategies to prevent conflicts, and not simply rely on ICTs.
    • Not every community or social group will have the same relationship to technology, and their ability to adopt new technologies is similarly influenced by their context. Therefore, a detailed needs assessment must take place before any new technologies are implemented.
    • New technologies may be co-opted by violent groups seeking to maintain conflict in the region. Consequently, humanitarian groups must be sensitive to existing political actors and be aware of possible negative consequences these new technologies may spark.
    • Local input is integral to supporting conflict prevention measures, and there is a need for collaboration and awareness-raising with communities to ensure new technologies are sustainable and effective.
    • Information shared among civil-society groups has greater potential to support early-warning systems. This horizontal distribution of information can also allow communities to hold local leaders accountable.

Meier, Patrick. “Digital Humanitarians: How Big Data Is Changing the Face of Humanitarian Response.” CRC Press, 2015. http://amzn.to/1RQ4ozc

  • This book traces the emergence of “Digital Humanitarians”—people who harness new digital tools and technologies to support humanitarian action. Meier suggests that this has created a “nervous system” to connect people from disparate parts of the world, revolutionizing the way we respond to humanitarian crises.
  • Meier argues that such technology is reconfiguring the structure of the humanitarian space, where victims are no longer simply passive recipients of aid but can contribute alongside other global citizens. This, in turn, makes us more humane and engaged people.

Robertson, Andrew, and Steve Olson. “Using Data Sharing to Improve Coordination in Peacebuilding.” United States Institute of Peace, 2012. http://bit.ly/235QuLm

  • This report functions as an overview of a roundtable workshop on Technology, Science and Peacebuilding held at the United States Institute of Peace. The workshop aimed to investigate how data-sharing techniques can be developed for use in peacebuilding or conflict management.
  • Four main themes emerged from discussions during the workshop:
    • “Data sharing requires working across a technology-culture divide”—Data sharing needs the foundation of a strong relationship, which can depend on sociocultural, rather than technological, factors.
    • “Information sharing requires building and maintaining trust”—These relationships are often built on trust, which can include both technological and social perspectives.
    • “Information sharing requires linking civilian-military policy discussions to technology”—Even when sophisticated data-sharing technologies exist, continuous engagement between different stakeholders is necessary. Therefore, procedures used to maintain civil-military engagement should be broadened to include technology.
    • “Collaboration software needs to be aligned with user needs”—Technology providers need to keep in mind the needs of their users, in this case peacebuilders, in order to ensure sustainability.

United Nations Independent Expert Advisory Group on a Data Revolution for Sustainable Development. “A World That Counts: Mobilizing the Data Revolution.” 2014. https://bit.ly/2Cb3lXq

  • This report focuses on the potential benefits and risks data holds for sustainable development. Included in this is a strategic framework for using and managing data for humanitarian purposes. It describes a need for a multinational consensus to be developed to ensure data is shared effectively and efficiently.
  • It suggests that “people who are counted”—i.e., those who are included in data collection processes—have better development outcomes and a better chance for humanitarian response in emergency or conflict situations.

Whipkey, Katie, and Andrej Verity. “Guidance for Incorporating Big Data into Humanitarian Operations.” Digital Humanitarian Network, 2015. http://bit.ly/1Y2BMkQ

  • This report, produced by the Digital Humanitarian Network, provides an overview of big data and how humanitarian organizations can integrate it into their humanitarian response. It primarily functions as a guide for organizations, providing concise outlines of what big data is and how it can benefit humanitarian groups.
  • The report puts forward four main benefits acquired through the use of big data by humanitarian organizations: 1) the ability to leverage real-time information; 2) the ability to make more informed decisions; 3) the ability to learn new insights; 4) the ability for organizations to be more prepared.
  • It goes on to assess seven challenges big data poses for humanitarian organizations, including: 1) geography and unequal access to technology across regions; 2) the potential for user error when processing data; 3) limited technology; 4) questionable validity of data; 5) underdeveloped policies and ethics relating to data management; and 6) limitations relating to staff knowledge.

Risks of Using Big Data in Humanitarian Context

Crawford, Kate, and Megan Finn. “The limits of crisis data: analytical and ethical challenges of using social and mobile data to understand disasters.” GeoJournal 80.4, 2015. http://bit.ly/1X0F7AI

  • Crawford & Finn present a critical analysis of the use of big data in disaster management, taking a more skeptical tone toward the data revolution facing humanitarian response.
  • They argue that though social and mobile data analysis can yield important insights and tools in crisis events, it also presents a number of limitations that can lead to oversights by researchers or humanitarian response teams.
  • Crawford & Finn explore the ethical concerns the use of big data in disaster events introduces, including issues of power, privacy, and consent.
  • The paper concludes by recommending that critical data studies, such as those presented in the paper, be integrated into crisis event research in order to analyze some of the assumptions which underlie mobile and social data.

Jacobsen, Katja Lindskov. “Making Design Safe for Citizens: A Hidden History of Humanitarian Experimentation.” Citizenship Studies 14.1: 89-103, 2010. http://bit.ly/1YaRTwG

  • This paper explores the phenomenon of “humanitarian experimentation,” where victims of disaster or conflict are the subjects of experiments to test the application of technologies before they are administered in greater civilian populations.
  • By analyzing the use of iris recognition technology during the repatriation of Afghan refugees from Pakistan between 2002 and 2007, Jacobsen suggests that this “humanitarian experimentation” compromises the security of already vulnerable refugees in order to better deliver biometric products to the rest of the world.

Responsible Data Forum. “Responsible Data Reflection Stories: An Overview.” http://bit.ly/1Rszrz1

  • This piece from the Responsible Data Forum is primarily a compilation of “war stories” that follow some of the challenges of using big data for social good. Drawing on these crowdsourced cases, the Forum also presents an overview with key recommendations for overcoming some of the challenges associated with big data in humanitarian organizations.
  • It finds that most of these challenges occur when organizations are ill-equipped to manage data and new technologies, or are unaware of how different groups interact in digital spaces in different ways.

Sandvik, Kristin Bergtora. “The humanitarian cyberspace: shrinking space or an expanding frontier?” Third World Quarterly 37:1, 17-32, 2016. http://bit.ly/1PIiACK

  • This paper analyzes the shift toward more technology-driven humanitarian work, which increasingly takes place online in cyberspace and is reshaping the definition and application of aid. This has occurred alongside what many suggest is a shrinking of the humanitarian space.
  • Sandvik provides three interpretations of this phenomenon:
    • First, traditional threats remain in the humanitarian space, which are both modified and reinforced by technology.
    • Second, new threats are introduced by the increasing use of technology in humanitarianism, and consequently the humanitarian space may be broadening, not shrinking.
    • Finally, if the shrinking humanitarian space theory holds, cyberspace offers one example of this, where the increasing use of digital technology to manage disasters leads to a contraction of space through the proliferation of remote services.

Additional Readings on Data and Humanitarian Response

* Thanks to: Kristin B. Sandvik; Zara Rahman; Jennifer Schulte; Sean McDonald; Paul Currion; Dinorah Cantú-Pedraza and the Responsible Data Listserve for valuable input.

Mapping a flood of new data


Rebecca Lipman at Economist Intelligence Unit Perspectives on “One city tweets to stay dry”: “From drones to old-fashioned phone calls, data come from many unlikely sources. In a disaster, such as a flood or earthquake, responders will take whatever information they can get to visualise the crisis and best direct their resources. Increasingly, cities prone to natural disasters are learning to better aid their citizens by empowering their local agencies and responders with sophisticated tools to cut through the large volume and velocity of disaster-related data and synthesise actionable information.

Consider the plight of the metro area of Jakarta, Indonesia, home to some 28m people, 13 rivers and 1,100 km of canals. With 40% of the city below sea level (and sinking), and regularly subject to extreme weather events including torrential downpours in monsoon season, Jakarta’s residents face far-too-frequent, life-threatening floods. Despite the unpredictability of flooding conditions, citizens have long taken a passive approach that depended on government entities to manage the response. But the information Jakarta’s responders had on the flooding conditions was patchy at best. So in the last few years, the government began to turn to the local population for help. It helped.

Today, Jakarta’s municipal government is relying on the web-based PetaJakarta.org project and a handful of other crowdsourcing mobile apps such as Qlue and CROP to collect data and respond to floods and other disasters. Through these programmes, crowdsourced, time-sensitive data derived from citizens’ social-media inputs have made it possible for city agencies to more precisely map the locations of rising floods and help the residents at risk. In January 2015, for example, the web-based Peta Jakarta received 5,209 reports on floods via tweets with detailed text and photos. Anytime there’s a flood, Peta Jakarta’s data from the tweets are mapped and updated every minute, and often cross-checked by Jakarta Disaster Management Agency (BPBD) officials through calls with community leaders to assess the information and guide responders.

But in any city Twitter is only one piece of a very large puzzle. …

Even with such life-and-death examples, government agencies remain deeply protective of data because of issues of security, data ownership and citizen privacy. They are also concerned about liability issues if incorrect data lead to an activity that has unsuccessful outcomes. These concerns encumber the combination of crowdsourced data with operational systems of record, and impede the fast progress needed in disaster situations…. Download the case study.”
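
The crowdsourced workflow described in the excerpt above (collect geotagged citizen reports, set aside unconfirmed ones, and refresh a per-district flood count every minute) can be illustrated with a minimal sketch. The report fields (district, confirmed, timestamp) and the aggregation logic below are illustrative assumptions only, not the actual PetaJakarta or Qlue data model.

```python
from collections import Counter
from datetime import datetime, timedelta, timezone

def recent_flood_counts(reports, window_minutes=60):
    """Count confirmed flood reports per district within the last window_minutes."""
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=window_minutes)
    counts = Counter()
    for report in reports:
        # Keep only reports that have been confirmed and are recent enough.
        if report["confirmed"] and report["timestamp"] >= cutoff:
            counts[report["district"]] += 1
    return counts

# Hypothetical usage with made-up reports:
sample_reports = [
    {"district": "Kampung Melayu", "confirmed": True,
     "timestamp": datetime.now(timezone.utc)},
    {"district": "Kampung Melayu", "confirmed": False,
     "timestamp": datetime.now(timezone.utc)},
]
print(recent_flood_counts(sample_reports))  # Counter({'Kampung Melayu': 1})
```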

Innovation Prizes in Practice and Theory


Paper by Michael J. Burstein and Fiona Murray: “Innovation prizes in reality are significantly different from innovation prizes in theory. The former are familiar from popular accounts of historical prizes like the Longitude Prize: the government offers a set amount for a solution to a known problem, like £20,000 for a method of calculating longitude at sea. The latter are modeled as compensation to inventors in return for donating their inventions to the public domain. Neither the economic literature nor the policy literature that led to the 2010 America COMPETES Reauthorization Act — which made prizes a prominent tool of government innovation policy — provides a satisfying justification for the use of prizes, nor does either literature address their operation. In this article, we address both of these problems. We use a case study of one canonical, high profile innovation prize — the Progressive Insurance Automotive X Prize — to explain how prizes function as institutional means to achieve exogenously defined innovation policy goals in the face of significant uncertainty and information asymmetries. Focusing on the structure and function of actual innovation prizes as an empirical matter enables us to make three theoretical contributions to the current understanding of prizes. First, we offer a stronger normative justification for prizes grounded in their status as a key institutional arrangement for solving a specified innovation problem. Second, we develop a model of innovation prize governance and then situate that model in the administrative state, as a species of “new governance” or “experimental” regulation. Third, we derive from those analyses a novel framework for choosing among prizes, patents, and grants, one in which the ultimate choice depends on a trade off between the efficacy and scalability of the institutional solution….(More)”

Opening Up Government: Citizen Innovation and New Modes of Collaboration


Chapter by Stefan Etzelstorfer, Thomas Gegenhuber and Dennis Hilgers in Open Tourism: Open Innovation, Crowdsourcing and Co-Creation Challenging the Tourism Industry: “Companies use crowdsourcing to solve problems by using a widely dispersed and large group of individuals. Crowdsourcing and open innovation are not restricted to businesses. Governments also increasingly rely on open innovation principles to harness the expert knowledge of citizens and use citizens’ contributions to the public value creation process. While a large body of literature has examined the open government paradigm at the national level, we still know relatively little about how open government initiatives play out at the local level. Even less is known about whether open government initiatives may create positive spillovers, for example by having a trickle-down effect onto local tourism sectors. In this article, we present the City of Linz’s open government activities. More specifically, we review how the public administration implemented the interactive mapping and reporting application “Schau auf Linz” (“Look at Linz”). Through our analysis of this case study, we show what role the local context and prior policies play in implementing open government initiatives on a local level. In addition, we discuss how this initiative, like others, leads to positive spillovers for the tourism sector….(More)”

Smarter State Case Studies


“Just as individuals use only part of their brainpower to solve most problems, governing institutions make far too little use of the skills and experience of those inside and outside of government with scientific credentials, practical skills, and ground-level street smarts. New data-rich tools—what The GovLab calls technologies of expertise—are making it possible to match the supply of citizen and civil servant talent to the demand for it in government to solve problems.

The Smarter State Case Studies examine how public institutions are using technologies of expertise, including:

Talent Bank – Professional, social and knowledge networks
Collaboration – Platforms for group work across silos
Project Platforms – Places for inviting new participants to work on projects
Toolkits – Repositories for shared content

Explore the design and key features of these novel platforms; how they are being implemented; the challenges encountered by both creators and users; and the anticipated impact of these new ways of working.
The case studies can be found at http://www.thegovlab.org/smarterstate.html
To share a case study, please contact: [email protected]

The Smart City and its Citizens


Paper by Carlo Francesco Capra on “Governance and Citizen Participation in Amsterdam Smart City…Smart cities are associated almost exclusively with modern technology and infrastructure. However, smart cities have the possibility to enhance the involvement and contribution of citizens to urban development. This work explores the role of governance as one of the factors influencing the participation of citizens in smart cities projects. Governance characteristics play a major role in explaining different typologies of citizen participation. Through a focus on Amsterdam Smart City program as a specific case study, this research examines the characteristics of governance that are present in the overall program and within a selected sample of projects, and how they relate to different typologies of citizen participation. The analysis and comprehension of governance characteristics plays a crucial role both for a better understanding and management of citizen participation, especially in complex settings where multiple actors are interacting….(More)”

Design-Led Innovation in the Public Sector


Manuel Sosa at INSEAD Knowledge: “When entering a government permit office, virtually everyone would prepare themselves for a certain amount of boredom and confusion. But resignation may well turn to surprise or even shock, if that office is Singapore’s Employment Pass Service Centre (EPSC), where foreign professionals go to receive their visa to work in the city-state. The ambience more closely resembles a luxury hotel lobby than a grim government agency, an impression reinforced by the roaming reception managers who greet arriving applicants, directing them to a waiting area with upholstered chairs and skyline views.

In a new case study, “Designing the Employment Pass Service Centre for the Ministry of Manpower, Singapore”, Prof. Michael Pich and I explore how even public organizations are beginning to use design to find and tap into innovation opportunities where few have thought to look. In the case of Singapore’s Ministry of Manpower (MOM), a design-led transformation of a single facility was the starting point of a drastic reconsideration of what a government agency could be.

Efficiency is not enough

Prior to opening the EPSC in July 2009, MOM’s Work Pass Division (WPD) had developed hyper-efficient methods to process work permits for foreign workers, who comprise approximately 40 percent of Singapore’s workforce. In fact, it was generally considered the most efficient department of its kind in the world. After 9/11, a mandatory-fingerprinting policy for white-collar workers was introduced, necessitating a standalone centre. The agency saw this as an opportunity to raise the efficiency bar even further.

Giving careful consideration to every aspect of the permit-granting process, the project team worked with a local vendor to overhaul the existing model. The proposal they ultimately presented to MOM assured almost unheard-of waiting times, as well as a more aesthetically pleasing look and feel….

Most public-sector organisations’ prickly interactions with the public can be explained with the simple fact that they lack competition. Government bodies are generally monopolies dispensing necessities, so on the whole they don’t feel compelled to agonise over their public face.

MOM and the Singapore government had a different idea. Aware that they were competing with other countries for top global talent, they recognised that the permit-granting process, in a very real sense, set the tone for foreign professionals’ entire experience of Singapore. Expats would be unlikely to remember precisely how long it took to get processed, but the quality of the service received would resonate in their minds and affect their impression of the country as a whole.

IDEO typically begins by concentrating on the user experience. In this case, in addition to observing and identifying what goes through the mind of a typical applicant during his or her journey in the existing system, the observation stage included talking to foreigners who were arriving in Singapore about their experience. IDEO discovered that professionals newly arrived in Singapore were embarking on an entirely new chapter of their lives, with all the expected stresses. The last thing they needed was more stress when receiving their permit. Hence, the EPSC entry hall is airy and free of clutter to create a sense of calm. The EPSC provides toys to keep kids entertained while their parents meet with agents and register for work passes. Visitors are always called by name, not number. Intimidating interview rooms were done away with in favour of open cabanas…. In its initial customer satisfaction survey in 2010, the EPSC scored an average rating of 5.7 out of 6….(More)”

OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data


Paper by Taha A. Kass-Hout et al. in JAMIA: “The objective of openFDA is to facilitate access and use of big, important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs).

Materials and Methods: Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges.

Results: Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6,000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event.

Conclusion: With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products…(More)”
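
To make the API described above concrete, here is a minimal sketch of querying openFDA’s public drug adverse-event endpoint (https://api.fda.gov/drug/event.json). The endpoint follows the public openFDA documentation, but the specific field names used below (patient.drug.medicinalproduct, patient.reaction[].reactionmeddrapt, receiptdate) and the example drug name are assumptions to be checked against https://open.fda.gov/apis/ before use.

```python
import json
import urllib.parse
import urllib.request

# Minimal sketch: query openFDA's drug adverse-event endpoint.
# Endpoint and field names follow the public openFDA documentation at the time
# of writing; verify them at https://open.fda.gov/apis/ before relying on this.
BASE_URL = "https://api.fda.gov/drug/event.json"

def fetch_adverse_events(drug_name, limit=5):
    """Return up to `limit` adverse-event reports that mention `drug_name`."""
    params = {
        "search": f'patient.drug.medicinalproduct:"{drug_name}"',
        "limit": str(limit),
    }
    url = f"{BASE_URL}?{urllib.parse.urlencode(params)}"
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    return payload.get("results", [])

if __name__ == "__main__":
    # Example: print the receipt date and reported reactions for each report.
    for report in fetch_adverse_events("aspirin"):
        reactions = [
            r.get("reactionmeddrapt", "unknown")
            for r in report.get("patient", {}).get("reaction", [])
        ]
        print(report.get("receiptdate", "n/a"), ", ".join(reactions))
```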

Technology: A Planning Guide for Political Parties


NDI launched a new website to serve as a resource for political parties that want to use technology to improve the way they function. “Technology: A Planning Guide for Political Parties” gives advice on common pitfalls political parties experience and potential pathways for success when implementing technology projects. The website also gives real-life examples through a set of case studies to help political parties learn from the experiences of other political parties and campaigns.

The case study Online Primary to Increase Participation Fails to Connect looks at how the European Green Party (EGP) announced an open online primary election ahead of the 2014 European elections. The online primary gave citizens three months to electronically vote for two of the four nominated candidates running for the European Parliament. According to Reinhard Bütikofer, co-chair of the EGP and member of the European Parliament, by implementing the online primary, the party “wanted to reduce the growing gap between citizens and political institutions.”

The EGP was hoping to mobilize 100,000 EU citizens to vote in the primaries, but as the elections came and went only 22,000 people participated. The low voter turnout underscored that the success of a new technology project depends on more than just functional technology. It also requires deep contextual analysis and strategic planning to assess user interest and clarify the level of marketing needed to encourage participation…..NDI’s technology guide includes step-by-step instructions on how parties can think through what ICT projects can achieve and how they can best utilize them. It also includes worksheets to help parties better understand the decisions they may have to make, including: custom versus off-the-shelf software, basic voter file requirements, how to calculate the real long- and short-term costs of an ICT project, and more…..For more information, please see this short overview of the site.”

Open Data as Open Educational Resources: Case studies of emerging practice


Book edited by Javiera Atenas and Leo Havemann: “…is the outcome of a collective effort that has its origins in the 5th Open Knowledge Open Education Working Group call, in which the idea of using Open Data in schools was mentioned. It occurred to us that Open Data and open educational resources seemed to us almost to exist in separate open worlds.

We decided to seek out evidence in the use of open data as OER, initially by conducting a bibliographical search. As we could not find published evidence, we decided to ask educators if they were in fact, using open data in this way, and wrote a post for this blog (with Ernesto Priego) explaining our perspective, called The 21st Century’s Raw Material: Using Open Data as Open Educational Resources. We ended the post with a link to an exploratory survey, the results of which indicated a need for more awareness of the existence and potential value of Open Data amongst educators…..

…the case studies themselves. They have been provided by scholars and practitioners from different disciplines and countries, and they reflect different approaches to the use of open data. The first case study presents an approach to educating both teachers and students in the use of open data for civil monitoring via Scuola di OpenCoesione in Italy, and has been written by Chiara Ciociola and Luigi Reggi. The second case, by Tim Coughlan from the Open University, UK, showcases practical applications in the use of local and contextualised open data for the development of apps. The third case, written by Katie Shamash, Juan Pablo Alperin & Alessandra Bordini from Simon Fraser University, Canada, demonstrates how publishing students can engage, through data analysis, in very current debates around scholarly communications and be encouraged to publish their own findings. The fourth case by Alan Dix from Talis and University of Birmingham, UK, and Geoffrey Ellis from University of Konstanz, Germany, is unique because the data discussed in this case is self-produced, indeed ‘quantified self’ data, which was used with students as material for class discussion and, separately, as source data for another student’s dissertation project. Finally, the fifth case, presented by Virginia Power from University of the West of England, UK, examines strategies to develop data and statistical literacies in future librarians and knowledge managers, aiming to support and extend their theoretical understanding of the concept of the ‘knowledge society’ through the use of Open Data….(More)

The book can be downloaded here: Open Data as Open Educational Resources.