Claudia Padovani and Elena Pavan in the journal “Global Networks”: In this article, we address transformations in global governance brought about by information and communication technologies (ICTs). Focusing on the specific domain of ‘gender-oriented communication governance’, we investigate online interactions among different kinds of actors active in promoting gender equity in and through the media. By tracing and analysing online issue networks, we investigate which actors are capable of influencing the framing of issues and of structuring discursive practices. From the analysis, different forms of power emerge, reflecting diverse modes of engaging in online interactions, where actors can operate as network ‘programmers’, ‘mobilizers’, or ‘switchers’. Our case study suggests that, often, old ways of conceiving actors’ interactions accompany the implementation of new communication tools, while the availability of a pervasive networked infrastructure does not automatically translate into meaningful interactions among all relevant actors in a specific domain….(More)”
The era of development mutants
Giulio Quaggiotto at Nesta: “If you were looking for the cutting edge of the development sector, where would you go these days? You would probably look at startups like Premise who have predicted food trends 25 days faster than national statistics in Brazil, or GiveDirectly who are pushing the boundaries on evidence – from RCTs to new ways of mapping poverty – to fast-track the adoption of cash transfers.
Or perhaps you might turn your attention to PetaJakarta, which is experimenting with new responses to crises by harnessing human sensor networks. You might be tempted to consider Airbnb’s Disaster Response programme as an indicator of an emerging alternative infrastructure for disaster response (and perhaps raising questions about the political economy of it all).
And could Bitnation’s Refugee Emergency programme in response to the European refugee crisis be the possible precursor of future solutions for transnational issues – among the development sector’s hardest challenges? Are the business models of One Acre Fund, which provides services for smallholder farmers, or Floodtags, which analyses citizen data during floods for water and disaster managers, indicators of future pathways to scale – that elusive development unicorn?
If you want to look at the future of procuring solutions for the development sector, should you be looking at initiatives like Citymart, which works with municipalities across the world to rethink traditional procurement and unleash the expertise and innovation capabilities of their citizens? By the same token, projects like Pathogen Box, Poverty Stoplight or Patient Innovation point to a brave new world where lead-user innovation and harnessing ‘sticky’ local knowledge become the norm, rather than the exception. You would also be forgiven for thinking that social movements across the world are the place to look for signs of future mechanisms for harnessing collective intelligence – Kawal Pemilu’s “citizen experts” self-organising around the Indonesian elections in 2014 is a textbook case study in this department.
The list could go on and on: welcome to the era of development mutants. While established players in the development sector are engrossed in soul-searching and their fitness for purpose is being scrutinised from all quarters, a whole new set of players is emerging, unfettered by legacy and borrowing from a variety of different disciplines. They point to a potentially different future – indeed, many potentially different futures – for the sector…..
But what if we wanted to invert this paradigm? How could we move from denial to fruitful collaboration with the ‘edgeryders’ of the development sector and accelerate its transformation?
Adopting new programming principles
Based on our experience working with development organisations, we believe that partnering with the mutants involves two types of shifts for traditional players: at the programmatic and the operational level. At the programmatic level, our work on the ground led us to articulate the following emerging principles:
- Mapping what people have, not what they need: even though approaches like jugaad and positive deviance have been around for a long time, unfortunately the default starting point for many development projects is still mapping needs, not assets. Inverting this paradigm allows for potentially disruptive project design and partnerships to emerge. (Signs of the future: Patient Innovation, Edgeryders, Community Mirror, Premise)
- Getting ready for multiple futures: When distributed across an organisation and not limited to a centralised function, the discipline of scanning the horizon for emergent solutions that contradict the dominant paradigm can help move beyond the denial phase and develop new interfaces to collaborate with the mutants. Here the link between analysis (to understand not only what is probable, but also what is possible) and action is critical – otherwise this remains purely an academic exercise. (Signs of the future: OpenCare, Improstuctures, Seeds of Good Anthropocene, Museum of the Future)
- Running multiple parallel experiments: According to Dave Snowden, in order to intervene in a complex system “you need multiple parallel experiments and they should be based on different and competing theories/hypotheses”. Unfortunately, many development projects are still based on linear narratives and assumptions such as “if only we run an awareness-raising campaign citizens will change their behaviour”. Turning linear narratives into hypotheses to be tested (without becoming religious about a specific approach) opens up the possibility to explore the solution landscape and collaborate with non-obvious partners that bring new approaches to the table. (Signs of the future: Chukua Hatua, GiveDirectly, Finnish PM’s Office of Experiments, Ideas42, Cognitive Edge)
- Embracing obliquity: A deep, granular understanding of local assets and dynamics, along with system mapping (see the final point below) and pairing behavioural experts with development practitioners, can help identify entry points for exploring new types of intervention based on obliquity principles. Mutants are often faster in adopting this approach and partnering with them is a way to bypass organisational inertia and explore nonlinear interventions. (Signs of the future: Sardex, social prescriptions, forensic architecture)
- From projects to systems: development organisations genuinely interested in developing new partnerships need to make the shift from the project logic to system investments. This involves, among other things, shifting the focus from providing solutions to helping every actor in the system to develop a higher level of consciousness about the issues they are facing and to take better decisions over time. It also entails partnering with mutants to explore entirely new financial mechanisms. (Signs of the future: Lankelly Chase, Indonesia waste banks, Dark Matter Labs)
Adopting new interfaces for working with the mutants
Harvard Business School professor Carliss Baldwin argued that most bureaucracies these days have a ‘non-contractible’ problem: they don’t know where smart people are, or how to evaluate how good they are. Most importantly, most smart people don’t want to work for them because they find them either too callous, unrewarding or slow (or a combination of all of these)….(More)”
Selected Readings on Data and Humanitarian Response
By Prianka Srinivasan and Stefaan G. Verhulst *
The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data and humanitarian response was originally published in 2016.
Data, when used well and in a trusted manner, allows humanitarian organizations to innovate how they respond to emergency events, including better coordination of post-disaster relief efforts, the ability to harness local knowledge to create more targeted relief strategies, and tools to predict and monitor disasters in real time. Consequently, in recent years both multinational groups and community-based advocates have begun to integrate data collection and evaluation strategies into their humanitarian operations, to respond better and more quickly to emergencies. However, this movement poses a number of challenges. Compared to the private sector, humanitarian organizations are often less equipped to successfully analyze and manage big data, which poses a number of risks related to the security of victims’ data. Furthermore, the complex power dynamics which exist within humanitarian spaces may be further exacerbated through the introduction of new technologies and big data collection mechanisms. Below we share:
- Selected Reading List (summaries and hyperlinks)
- Annotated Selected Reading List
- Additional Readings
Selected Reading List (summaries in alphabetical order)
Data and Humanitarian Response
- John Karlsrud – Peacekeeping 4.0: Harnessing the Potential of Big Data, Social Media, and Cyber Technologies – Recommends that UN peacekeeping initiatives should better integrate big data and new technologies into their operations, adopting a “Peacekeeping 4.0” for the modern world.
- Francesco Mancini, International Peace Institute – New Technology and the Prevention of Violence and Conflict – Explores the ways in which new tools available in communications technology can assist humanitarian workers in preventing violence and conflict.
- Patrick Meier – Digital Humanitarians: How Big Data Is Changing the Face of Humanitarian Response – Profiles the emergence of ‘Digital Humanitarians’—humanitarian workers who are using big data, crowdsourcing and new technologies to transform the way societies respond to humanitarian disasters.
- Andrew Robertson and Steve Olson (USIP) – Using Data Sharing to Improve Coordination in Peacebuilding – Summarises the findings of a United States Institute of Peace workshop which investigated the use of data-sharing systems between government and non-government actors in conflict zones. It identifies some of the challenges and benefits of data-sharing in peacebuilding efforts.
- United Nations Independent Expert Advisory Group on a Data Revolution for Sustainable Development – A World That Counts, Mobilizing the Data Revolution – Compiled by a group of 20 international experts, this report proposes ways to improve data management and monitoring, whilst mitigating some of the risks data poses.
- Katie Whipkey and Andrej Verity – Guidance for Incorporating Big Data into Humanitarian Operations – Created as part of the Digital Humanitarian Network with the support of UN-OCHA, this is a manual for humanitarian organizations looking to strategically incorporate Big Data into their work.
Risks of Using Big Data in Humanitarian Context
- Kate Crawford and Megan Finn – The limits of crisis data: analytical and ethical challenges of using social and mobile data to understand disasters – Analyzes the use of big data techniques following a crisis event, arguing that a reliance on social and mobile data can lead to significant oversights and ethical concerns in the wake of humanitarian disasters.
- Katja Lindskov Jacobsen – Making design safe for citizens: A hidden history of humanitarian experimentation – Argues that the UNHCR’s use of iris recognition technology in 2002 and 2007 during the repatriation of Afghan refugees from Pakistan constitutes a case of “humanitarian experimentation.” It questions this sort of experimentation which compromises the security of refugees in the pursuit of safer technologies for the rest of the world.
- Responsible Data Forum – Responsible Data Reflection Stories: An Overview – Compiles various stories sourced by the Responsible Data Forum blog relating to data challenges faced by advocacy organizations, and draws recommendations based on these cases.
- Kristin Bergtora Sandvik – The humanitarian cyberspace: shrinking space or an expanding frontier? – Provides a detailed account of the development of a “humanitarian cyberspace” and how information and communication technologies have been further integrated into humanitarian operations since the mid-1990s.
Annotated Selected Reading List (in alphabetical order)
Karlsrud, John. “Peacekeeping 4.0: Harnessing the Potential of Big Data, Social Media, and Cyber Technologies.” Cyberspace and International Relations, 2013. http://bit.ly/235Qb3e
- This chapter from the book “Cyberspace and International Relations” suggests that advances in big data give humanitarian organizations unprecedented opportunities to prevent and mitigate natural disasters and humanitarian crises. However, the sheer amount of unstructured data necessitates effective “data mining” strategies for multinational organizations to make the most use of this data.
- By profiling some civil-society organizations who use big data in their peacekeeping efforts, Karlsrud suggests that these community-focused initiatives are leading the movement toward analyzing and using big data in countries vulnerable to crisis.
- The chapter concludes by offering ten recommendations to UN peacekeeping forces to best realize the potential of big data and new technology in supporting their operations.
Mancini, Francesco. “New Technology and the Prevention of Violence and Conflict.” International Peace Institute, 2013. http://bit.ly/1ltLfNV
- This report from the International Peace Institute looks at five case studies to assess how information and communications technologies (ICTs) can help prevent humanitarian conflicts and violence. The findings suggest that context has a significant impact on the ability of these ICTs to prevent conflict, and that any strategy must take into account the specific contingencies of the region to be successful.
- The report suggests seven lessons gleaned from the five case studies:
- New technologies are just one of a variety of tools to combat violence. Consequently, organizations must investigate a variety of complementary strategies to prevent conflicts, and not simply rely on ICTs.
- Not every community or social group will have the same relationship to technology, and their ability to adopt new technologies is similarly influenced by their context. Therefore, a detailed needs assessment must take place before any new technologies are implemented.
- New technologies may be co-opted by violent groups seeking to maintain conflict in the region. Consequently, humanitarian groups must be sensitive to existing political actors and be aware of possible negative consequences these new technologies may spark.
- Local input is integral to support conflict prevention measures, and there exists need for collaboration and awareness-raising with communities to ensure new technologies are sustainable and effective.
- Information shared among civil-society groups has greater potential to support early-warning systems. This horizontal distribution of information can also allow communities to hold local leaders accountable.
Meier, Patrick. “Digital Humanitarians: How Big Data Is Changing the Face of Humanitarian Response.” CRC Press, 2015. http://amzn.to/1RQ4ozc
- This book traces the emergence of “Digital Humanitarians”—people who harness new digital tools and technologies to support humanitarian action. Meier suggests that this has created a “nervous system” to connect people from disparate parts of the world, revolutionizing the way we respond to humanitarian crises.
- Meier argues that such technology is reconfiguring the structure of the humanitarian space, where victims are not simply passive recipients of aid but can contribute alongside other global citizens. This, in turn, makes us more humane and engaged people.
Robertson, Andrew and Olson, Steve. “Using Data Sharing to Improve Coordination in Peacebuilding.” United States Institute of Peace, 2012. http://bit.ly/235QuLm
- This report functions as an overview of a roundtable workshop on Technology, Science and Peacebuilding held at the United States Institute of Peace. The workshop aimed to investigate how data-sharing techniques can be developed for use in peacebuilding or conflict management.
- Four main themes emerged from discussions during the workshop:
- “Data sharing requires working across a technology-culture divide”—Data sharing needs the foundation of a strong relationship, which can depend on sociocultural, rather than technological, factors.
- “Information sharing requires building and maintaining trust”—These relationships are often built on trust, which can include both technological and social perspectives.
- “Information sharing requires linking civilian-military policy discussions to technology”—Even when sophisticated data-sharing technologies exist, continuous engagement between different stakeholders is necessary. Therefore, procedures used to maintain civil-military engagement should be broadened to include technology.
- “Collaboration software needs to be aligned with user needs”—technology providers need to keep in mind the needs of their users, in this case peacebuilders, in order to ensure sustainability.
United Nations Independent Expert Advisory Group on a Data Revolution for Sustainable Development. “A World That Counts, Mobilizing the Data Revolution.” 2014. https://bit.ly/2Cb3lXq
- This report focuses on the potential benefits and risks data holds for sustainable development. Included in this is a strategic framework for using and managing data for humanitarian purposes. It describes a need for a multinational consensus to be developed to ensure data is shared effectively and efficiently.
- It suggests that “people who are counted”—i.e., those who are included in data collection processes—have better development outcomes and a better chance for humanitarian response in emergency or conflict situations.
Whipkey, Katie and Verity, Andrej. “Guidance for Incorporating Big Data into Humanitarian Operations.” Digital Humanitarian Network, 2015. http://bit.ly/1Y2BMkQ
- This report, produced by the Digital Humanitarian Network, provides an overview of big data and how humanitarian organizations can integrate this technology into their humanitarian response. It primarily functions as a guide for organizations, providing concise outlines of what big data is and how it can benefit humanitarian groups.
- The report puts forward four main benefits acquired through the use of big data by humanitarian organizations: 1) the ability to leverage real-time information; 2) the ability to make more informed decisions; 3) the ability to learn new insights; 4) the ability for organizations to be more prepared.
- It goes on to assess seven challenges big data poses for humanitarian organizations: 1) geography, and the unequal access to technology across regions; 2) the potential for user error when processing data; 3) limited technology; 4) questionable validity of data; 5) underdeveloped policies and ethics relating to data management; 6) limitations relating to staff knowledge.
Risks of Using Big Data in Humanitarian Context
Crawford, Kate, and Megan Finn. “The limits of crisis data: analytical and ethical challenges of using social and mobile data to understand disasters.” GeoJournal 80.4, 2015. http://bit.ly/1X0F7AI
- Crawford & Finn present a critical analysis of the use of big data in disaster management, taking a more skeptical tone to the data revolution facing humanitarian response.
- They argue that though social and mobile data analysis can yield important insights and tools in crisis events, it also presents a number of limitations which can lead to oversights being made by researchers or humanitarian response teams.
- Crawford & Finn explore the ethical concerns the use of big data in disaster events introduces, including issues of power, privacy, and consent.
- The paper concludes by recommending that critical data studies, such as those presented in the paper, be integrated into crisis event research in order to analyze some of the assumptions which underlie mobile and social data.
Jacobsen, Katja Lindskov. “Making design safe for citizens: A hidden history of humanitarian experimentation.” Citizenship Studies 14.1, 2010: 89-103. http://bit.ly/1YaRTwG
- This paper explores the phenomenon of “humanitarian experimentation,” where victims of disaster or conflict are the subjects of experiments to test the application of technologies before they are administered in greater civilian populations.
- By analyzing the particular use of iris recognition technology during the repatriation of Afghan refugees from Pakistan in 2002 and 2007, Jacobsen suggests that this “humanitarian experimentation” compromises the security of already vulnerable refugees in order to better deliver biometric products to the rest of the world.
Responsible Data Forum. “Responsible Data Reflection Stories: An Overview.” http://bit.ly/1Rszrz1
- This piece from the Responsible Data forum is primarily a compilation of “war stories” which follow some of the challenges in using big data for social good. By drawing on these crowdsourced cases, the Forum also presents an overview which makes key recommendations to overcome some of the challenges associated with big data in humanitarian organizations.
- It finds that most of these challenges occur when organizations are ill-equipped to manage data and new technologies, or are unaware of how different groups interact in digital spaces in different ways.
Sandvik, Kristin Bergtora. “The humanitarian cyberspace: shrinking space or an expanding frontier?” Third World Quarterly 37:1, 17-32, 2016. http://bit.ly/1PIiACK
- This paper analyzes the shift toward more technology-driven humanitarian work, which increasingly takes place online in cyberspace, reshaping the definition and application of aid. This has occurred alongside what many suggest is a shrinking of the humanitarian space.
- Sandvik provides three interpretations of this phenomenon:
- First, traditional threats remain in the humanitarian space, which are both modified and reinforced by technology.
- Second, new threats are introduced by the increasing use of technology in humanitarianism, and consequently the humanitarian space may be broadening, not shrinking.
- Finally, if the shrinking humanitarian space theory holds, cyberspace offers one example of this, where the increasing use of digital technology to manage disasters leads to a contraction of space through the proliferation of remote services.
Additional Readings on Data and Humanitarian Response
- Kristin Bergtora Sandvik, et al. – Humanitarian technology: a critical research agenda – Takes a critical look at the field of humanitarian technology, analyzing the challenges it poses in post-disaster and conflict environments.
- Kristin Bergtora Sandvik – “The Risks of Technological Innovation.” – Suggests that despite the evident benefits such technology presents, it can also undermine humanitarian action and lead to “catastrophic events” that themselves require a new type of humanitarian response.
- Ryan Burns – Rethinking big data in digital humanitarianism: practices, epistemologies, and social relations – Takes a critical look at the use of big data in humanitarian spaces, arguing that the advent of digital humanitarianism has profound political and social implications, and can in fact limit information available following a humanitarian crisis.
- Kate Crawford – Is Data a Danger to the Developing World? – Argues that it is not simply risks to privacy that data poses to developing countries, but suggests that “data discrimination” can affect even the basic human rights of individuals, and introduce problematic power hierarchies between those who can access data and those who cannot.
- Paul Currion – Eyes Wide Shut: The challenge of humanitarian biometrics – Examines the use of biometrics by humanitarian organizations and national governments, and suggests stronger accountability is needed to ensure data from marginalized groups remain protected.
- Yves-Alexandre de Montjoye, Jake Kendall and Cameron F. Kerry – Enabling Humanitarian Use of Mobile Phone Data – Analyzes how data from mobile communication can provide insights into the spread of infectious disease, and how such data can also compromise individual privacy.
- Michael F. Goodchild and Alan Glennon – Crowdsourcing geographic information for disaster response: a research frontier – Explores how volunteered geographic data, though often messy and unreliable, can provide many benefits in emergency situations.
- Raphael Horler – Crowdsourcing in the Humanitarian Network – An Analysis of the Literature – A bachelor’s thesis which explores the increasing use of crowdsourced data by organizations involved in disaster response, investigating some of the challenges such use of crowdsourcing poses.
- Gus Hosein and Carly Nyst – Aiding Surveillance – Suggests that the unregulated use of technologies and surveillance systems by humanitarian organizations create systems which pose serious threats to individuals’ rights, particularly their right to privacy.
- Katja Lindskov Jacobsen – The Politics of Humanitarian Technology: Good Intentions, Unintended Consequences and Insecurity – Raises concerns about the rise of data collection and digital technology in humanitarian aid organizations, arguing that its unquestioned prominence creates new structures of power and control, which remain hidden under the rubric of liberal humanitarianism.
- Mirca Madianou – Digital Inequality and Second-Order Disasters: Social Media in the Typhoon Haiyan Recovery – Taking the effects of Typhoon Haiyan as a key case study, this paper investigates how digital inequalities and unequal access to data can exacerbate existing social inequalities in a post-disaster environment.
- Sean Martin McDonald – Ebola: A Big Data Disaster. Privacy, Property, and the Law of Disaster Experimentation – Analyzes the challenges and privacy risks of using unregulated data in public health coordination by taking the use of Call Detail Record (CDR) data during the Ebola crisis as a key case study.
- National Academy of Engineering – Sensing and Shaping Emerging Conflicts: Report of a Joint Workshop of the National Academy of Engineering and the United States Institute of Peace: Roundtable on Technology, Science, and Peacebuilding – Building on the overview report of the United States Institute of Peace workshop, this report examines what opportunities new technologies and data sharing provide for humanitarian groups.
- Mary K. Pratt – Big Data’s role in humanitarian aid – A Computerworld article which provides an overview of Big Data and how it is improving the efficiency and efficacy of humanitarian response, especially in conflict zones.
- Róisín Read, Bertrand Taithe and Roger Mac Ginty – Data hubris? Humanitarian information systems and the mirage of technology – Looks specifically at visual technology, crisis mapping and big data, and suggests that the claims made on behalf of technologically advanced humanitarian information systems are often overenthusiastic.
- Linnet Taylor – No place to hide? The ethics and analytics of tracking mobility using mobile phone data – Examines the ethical problems associated with the tracking of mobile phone data, especially in low or middle-income countries.
- UN Office for the Coordination of Humanitarian Affairs (UN-OCHA) – Big data and humanitarianism: 5 things you need to know – Briefly outlines five issues that face humanitarian organizations as they integrate big data into their operations.
- United Nations Global Pulse – Mapping the Risk-Utility Landscape of Mobile Data for Sustainable Development and Humanitarian Action – Reports on a Global Pulse project (conducted in partnership with the Massachusetts Institute of Technology) which examined how the utility of aggregated mobile data can be maximized while protecting privacy and providing effective support to crisis response.
- The Wilson Center – Connecting Grassroots to Government for Disaster Management: Workshop Summary – Summarizes the key points drawn from a two-day Wilson Center workshop, which investigated how new technologies could engage whole communities in disaster management.
* Thanks to: Kristin B. Sandvik; Zara Rahman; Jennifer Schulte; Sean McDonald; Paul Currion; Dinorah Cantú-Pedraza and the Responsible Data listserv for valuable input.
Mapping a flood of new data
Rebecca Lipman at Economist Intelligence Unit Perspectives on “One city tweets to stay dry”: “From drones to old-fashioned phone calls, data come from many unlikely sources. In a disaster, such as a flood or earthquake, responders will take whatever information they can get to visualise the crisis and best direct their resources. Increasingly, cities prone to natural disasters are learning to better aid their citizens by empowering their local agencies and responders with sophisticated tools to cut through the large volume and velocity of disaster-related data and synthesise actionable information.
Consider the plight of the metro area of Jakarta, Indonesia, home to some 28m people, 13 rivers and 1,100 km of canals. With 40% of the city below sea level (and sinking), and regularly subject to extreme weather events including torrential downpours in monsoon season, Jakarta’s residents face far-too-frequent, life-threatening floods. Despite the unpredictability of flooding conditions, citizens have long taken a passive approach that depended on government entities to manage the response. But the information Jakarta’s responders had on the flooding conditions was patchy at best. So in the last few years, the government began to turn to the local population for help. It helped.
Today, Jakarta’s municipal government is relying on the web-based PetaJakarta.org project and a handful of other crowdsourcing mobile apps such as Qlue and CROP to collect data and respond to floods and other disasters. Through these programmes, crowdsourced, time-sensitive data derived from citizens’ social-media inputs have made it possible for city agencies to more precisely map the locations of rising floods and help the residents at risk. In January 2015, for example, the web-based Peta Jakarta received 5,209 reports on floods via tweets with detailed text and photos. Anytime there’s a flood, Peta Jakarta’s data from the tweets are mapped and updated every minute, and often cross-checked by Jakarta Disaster Management Agency (BPBD) officials through calls with community leaders to assess the information and guide responders.
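The article does not describe PetaJakarta’s internal pipeline, so the sketch below is only a conceptual illustration of the general pattern it hints at: time-stamped, geotagged citizen reports are binned into small map cells and re-counted over a rolling window, giving responders a continuously refreshed picture of where reports are clustering. The FloodReport record, the grid size and the time window are assumptions made for the example, not details of the actual system.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class FloodReport:
    """A single crowdsourced report: where, when, and what was said."""
    lat: float
    lon: float
    timestamp: datetime
    text: str


def aggregate_reports(reports: List[FloodReport],
                      window_minutes: int = 60,
                      cell_size_deg: float = 0.01) -> Counter:
    """Bin recent reports into roughly 1 km grid cells.

    Returns a Counter mapping (lat_bin, lon_bin) to the number of reports
    received inside the time window, a crude proxy for flood activity
    that a map layer could refresh every minute.
    """
    cutoff = datetime.utcnow() - timedelta(minutes=window_minutes)
    cells: Counter = Counter()
    for report in reports:
        if report.timestamp >= cutoff:
            cell = (round(report.lat / cell_size_deg),
                    round(report.lon / cell_size_deg))
            cells[cell] += 1
    return cells


# Two invented reports from central Jakarta, for illustration only.
sample = [
    FloodReport(-6.2088, 106.8456, datetime.utcnow(), "banjir near Dukuh Atas"),
    FloodReport(-6.2095, 106.8449, datetime.utcnow(), "knee-deep water here"),
]
print(aggregate_reports(sample).most_common(5))
```

In practice the hard parts are upstream of this step: soliciting reports, geolocating them reliably and, as the article notes, cross-checking them with officials before responders act on the map.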
But in any city Twitter is only one piece of a very large puzzle. …
Even with such life-and-death examples, government agencies remain deeply protective of data because of issues of security, data ownership and citizen privacy. They are also concerned about liability issues if incorrect data lead to an activity that has unsuccessful outcomes. These concerns encumber the combination of crowdsourced data with operational systems of record, and impede the fast progress needed in disaster situations….Download the case study here.”
Innovation Prizes in Practice and Theory
Paper by Michael J. Burstein and Fiona Murray: “Innovation prizes in reality are significantly different from innovation prizes in theory. The former are familiar from popular accounts of historical prizes like the Longitude Prize: the government offers a set amount for a solution to a known problem, like £20,000 for a method of calculating longitude at sea. The latter are modeled as compensation to inventors in return for donating their inventions to the public domain. Neither the economic literature nor the policy literature that led to the 2010 America COMPETES Reauthorization Act — which made prizes a prominent tool of government innovation policy — provides a satisfying justification for the use of prizes, nor does either literature address their operation. In this article, we address both of these problems. We use a case study of one canonical, high-profile innovation prize — the Progressive Insurance Automotive X Prize — to explain how prizes function as institutional means to achieve exogenously defined innovation policy goals in the face of significant uncertainty and information asymmetries. Focusing on the structure and function of actual innovation prizes as an empirical matter enables us to make three theoretical contributions to the current understanding of prizes. First, we offer a stronger normative justification for prizes grounded in their status as a key institutional arrangement for solving a specified innovation problem. Second, we develop a model of innovation prize governance and then situate that model in the administrative state, as a species of “new governance” or “experimental” regulation. Third, we derive from those analyses a novel framework for choosing among prizes, patents, and grants, one in which the ultimate choice depends on a trade-off between the efficacy and scalability of the institutional solution….(More)”
Opening Up Government: Citizen Innovation and New Modes of Collaboration
Chapter by Stefan Etzelstorfer, Thomas Gegenhuber and Dennis Hilgers in Open Tourism: Open Innovation, Crowdsourcing and Co-Creation Challenging the Tourism Industry: “Companies use crowdsourcing to solve problems by using a widely dispersed and large group of individuals. Crowdsourcing and open innovation are not restricted to businesses. Governments also increasingly rely on open innovation principles to harness the expert knowledge of citizens and use citizens’ contributions to the public value creation process. While a large body of literature has examined the open government paradigm at the national level, we still know relatively little about how open government initiatives play out at the local level. Even less is known about whether open government initiatives may create positive spillovers, for example by having a trickle-down effect onto local tourism sectors. In this article, we present the City of Linz’s open government activities. More specifically, we review how the public administration implemented the interactive mapping and reporting application “Schau auf Linz” (“Look at Linz”). Through our analysis of this case study, we show what role the local context and prior policies play in implementing open government initiatives on a local level. In addition, we discuss how this initiative, like others, leads to positive spillovers for the tourism sector….(More)”
Smarter State Case Studies
“Just as individuals use only part of their brainpower to solve most problems, governing institutions make far too little use of the skills and experience of those inside and outside of government with scientific credentials, practical skills, and ground-level street smarts. New data-rich tools—what The GovLab calls technologies of expertise—are making it possible to match the supply of citizen and civil servant talent to the demand for it in government to solve problems.
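As a rough illustration of what “matching the supply of talent to the demand for it” can mean in software terms, the sketch below ranks candidate contributors by the overlap between the skills a problem calls for and the skills people have declared. The function, the profiles and the skill tags are all hypothetical; none of the platforms described in the case studies is implied to work this way.

```python
from typing import Dict, List, Set, Tuple


def match_experts(problem_skills: Set[str],
                  profiles: Dict[str, Set[str]],
                  top_n: int = 3) -> List[Tuple[str, float, List[str]]]:
    """Rank candidate contributors by how well their declared skills
    cover the skills a problem calls for."""
    ranked = []
    for name, skills in profiles.items():
        overlap = problem_skills & skills
        if overlap:
            coverage = len(overlap) / len(problem_skills)
            ranked.append((name, coverage, sorted(overlap)))
    return sorted(ranked, key=lambda item: item[1], reverse=True)[:top_n]


# Invented profiles and an invented problem statement, for illustration only.
profiles = {
    "Ana": {"gis", "hydrology", "python"},
    "Ben": {"procurement", "public law"},
    "Chen": {"gis", "community outreach"},
}
print(match_experts({"gis", "hydrology"}, profiles))
```

Real platforms layer much more on top of this (self-nomination, verification, incentives), but the basic supply-demand matching step is the part the “technologies of expertise” framing points to.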
The Smarter State Case Studies examine how public institutions are using technologies of expertise, including:
Talent Bank – Professional, social and knowledge networks
Collaboration – Platforms for group work across silos
Project Platforms – Places for inviting new participants to work on projects
Toolkits – Repositories for shared content
Explore the design and key features of these novel platforms; how they are being implemented; the challenges encountered by both creators and users and the anticipated impact of these new ways of working.
The case studies can be found at http://www.thegovlab.org/smarterstate.html
To share a case study, please contact: [email protected]“
The Smart City and its Citizens
Paper by Carlo Francesco Capra on “Governance and Citizen Participation in Amsterdam Smart City”: “Smart cities are associated almost exclusively with modern technology and infrastructure. However, smart cities have the possibility to enhance the involvement and contribution of citizens to urban development. This work explores the role of governance as one of the factors influencing the participation of citizens in smart city projects. Governance characteristics play a major role in explaining different typologies of citizen participation. Through a focus on the Amsterdam Smart City program as a specific case study, this research examines the characteristics of governance that are present in the overall program and within a selected sample of projects, and how they relate to different typologies of citizen participation. The analysis and comprehension of governance characteristics plays a crucial role both for a better understanding and management of citizen participation, especially in complex settings where multiple actors are interacting….(More)”
Design-Led Innovation in the Public Sector
Manuel Sosa at INSEAD Knowledge: “When entering a government permit office, virtually everyone would prepare themselves for a certain amount of boredom and confusion. But resignation may well turn to surprise or even shock, if that office is Singapore’s Employment Pass Service Centre (EPSC), where foreign professionals go to receive their visa to work in the city-state. The ambience more closely resembles a luxury hotel lobby than a grim government agency, an impression reinforced by the roaming reception managers who greet arriving applicants, directing them to a waiting area with upholstered chairs and skyline views.
In a new case study, “Designing the Employment Pass Service Centre for the Ministry of Manpower, Singapore”, Prof. Michael Pich and I explore how even public organizations are beginning to use design to find and tap into innovation opportunities where few have thought to look. In the case of Singapore’s Ministry of Manpower (MOM), a design-led transformation of a single facility was the starting point of a drastic reconsideration of what a government agency could be.
Efficiency is not enough
Prior to opening the EPSC in July 2009, MOM’s Work Pass Division (WPD) had developed hyper-efficient methods to process work permits for foreign workers, who comprise approximately 40 percent of Singapore’s workforce. In fact, it was generally considered the most efficient department of its kind in the world. After 9/11, a mandatory-fingerprinting policy for white-collar workers was introduced, necessitating a standalone centre. The agency saw this as an opportunity to raise the efficiency bar even further.
Giving careful consideration to every aspect of the permit-granting process, the project team worked with a local vendor to overhaul the existing model. The proposal they ultimately presented to MOM assured almost unheard-of waiting times, as well as a more aesthetically pleasing look and feel….
Most public-sector organisations’ prickly interactions with the public can be explained by the simple fact that they lack competition. Government bodies are generally monopolies dispensing necessities, so on the whole they don’t feel compelled to agonise over their public face.
MOM and the Singapore government had a different idea. Aware that they were competing with other countries for top global talent, they recognised that the permit-granting process, in a very real sense, set the tone for foreign professionals’ entire experience of Singapore. Expats would be unlikely to remember precisely how long it took to get processed, but the quality of the service received would resonate in their minds and affect their impression of the country as a whole.
IDEO typically begins by concentrating on the user experience. In this case, in addition to observing and identifying what goes through the mind of a typical applicant during his or her journey in the existing system, the observation stage included talking to foreigners who were arriving in Singapore about their experience. IDEO discovered that professionals newly arrived in Singapore were embarking on an entirely new chapter of their lives, with all the expected stresses. The last thing they needed was more stress when receiving their permit. Hence, the EPSC entry hall is airy and free of clutter to create a sense of calm. The EPSC provides toys to keep kids entertained while their parents meet with agents and register for work passes. Visitors are always called by name, not number. Intimidating interview rooms were done away with in favour of open cabanas….In its initial customer satisfaction survey in 2010, the EPSC scored an average rating of 5.7 out of 6….(More)”
OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data
Paper by Taha A Kass-Hout et al in JAMIA: “The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs).
Materials and Methods: Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges.
Results: Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event.
Conclusion: With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products…(More)”
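As a small illustration of the kind of query the paper describes, the sketch below counts the most frequently reported adverse reactions for a given drug via openFDA’s public drug adverse-event endpoint. It is an assumption-laden example rather than anything taken from the paper itself: the drug name is arbitrary, the field names follow openFDA’s published query examples, and live network access (and, for heavier use, an API key) is required.

```python
import requests  # assumes the third-party 'requests' package is installed

OPENFDA_BASE = "https://api.fda.gov"


def count_top_reactions(drug: str, limit: int = 5):
    """Return the most frequently reported adverse reactions for a drug,
    using openFDA's public drug adverse-event endpoint."""
    params = {
        # Field names follow openFDA's published query examples.
        "search": f'patient.drug.medicinalproduct:"{drug}"',
        "count": "patient.reaction.reactionmeddrapt.exact",
        "limit": limit,
    }
    response = requests.get(f"{OPENFDA_BASE}/drug/event.json",
                            params=params, timeout=30)
    response.raise_for_status()
    return [(item["term"], item["count"]) for item in response.json()["results"]]


if __name__ == "__main__":
    # "aspirin" is an arbitrary example; unauthenticated requests are rate-limited.
    for term, count in count_top_reactions("aspirin"):
        print(f"{term}: {count}")
```

The same pattern (a search expression plus a count field) applies to the other endpoints the paper mentions, such as recalls and drug labeling.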