Mapping community resources for disaster preparedness: humanitarian data capability and automated futures


Report by Anthony McCosker et al.: “This report details the rationale, background research and design for a platform to help local communities map resources for disaster preparedness. It sets out a first step in improving community data capability through resource mapping to enhance humanitarian action before disaster events occur. The project seeks to enable local community disaster preparedness and thus build community resilience by improving the quality of data about community strengths, resources and assets.

In this report, the authors identify a gap in existing humanitarian mapping approaches and in the uses of open, public and social media data in humanitarian contexts. The report surveys current knowledge and presents a selection of case studies of data and humanitarian mapping in local communities.

Drawing on this knowledge and practice review and stakeholder workshops throughout 2021, the authors also define a method and toolkit for the effective use of community assets data…(More)”

Big, Open Data for Development: A Vision for India 


Paper by Sam Asher, Aditi Bhowmick, Alison Campion, Tobias Lunt and Paul Novosad: “The government generates terabytes of data directly and incidentally in the operation of public programs. For intrinsic and instrumental reasons, these data should be made open to the public. Intrinsically, a right to government data is implicit in the right to information. Instrumentally, open government data will improve policy, increase accountability, empower citizens, create new opportunities for private firms, and lead to development and economic growth. A series of case studies demonstrates these benefits in a range of other contexts. We next examine how government can maximize social benefit from government data. This entails opening administrative data as far upstream in the data pipeline as possible. Most administrative data can be minimally aggregated to protect privacy, while providing data with high geographic granularity. We assess the status quo of the Government of India’s data production and dissemination pipeline, and find that the greatest weakness lies in the last mile: making government data accessible to the public. This means more than posting it online; we describe a set of principles for lowering the access and use costs close to zero. Finally, we examine the use of government data to guide policy in the COVID-19 pandemic. Civil society played a key role in aggregating, disseminating, and analyzing government data, providing analysis that was essential to policy response. However, key pieces of data, like testing rates and seroprevalence distribution, were unnecessarily withheld by the government, data which could have substantially improved the policy response. A more open approach to government data would have saved many lives…(More)”.

Can open-source technologies support open societies?


Report by Victoria Welborn and George Ingram: “In the 2020 “Roadmap for Digital Cooperation,” U.N. Secretary General António Guterres highlighted digital public goods (DPGs) as a key lever in maximizing the full potential of digital technology to accelerate progress toward the Sustainable Development Goals (SDGs) while also helping overcome some of its persistent challenges.

The Roadmap rightly pointed to the fact that, as with any new technology, there are risks around digital technologies that might be counterproductive to fostering prosperous, inclusive, and resilient societies. In fact, without intentional action by the global community, digital technologies may more naturally exacerbate exclusion and inequality by undermining trust in critical institutions, allowing consolidation of control and economic value by the powerful, and eroding social norms through breaches of privacy and disinformation campaigns. 

Just as the pandemic has served to highlight the opportunity for digital technologies to reimagine and expand the reach of government service delivery, so too has it surfaced specific risks that are hallmarks of closed societies and authoritarian states—creating new pathways to government surveillance, reinforcing existing socioeconomic inequalities, and enabling the rapid proliferation of disinformation. Why then—in the face of these real risks—focus on the role of digital public goods in development?

As the Roadmap noted, DPGs are “open source software, open data, open AI models, open standards and open content that adhere to privacy and other applicable laws and best practices, do no harm, and help attain the SDGs.”[1] There are a number of reasons why such products have unique potential to accelerate development efforts, including widely recognized benefits related to more efficient and cost-effective implementation of technology-enabled development programming.

Historically, the use of digital solutions for development in low- and middle-income countries (LMICs) has been supported by donor investments in sector-specific technology systems, reinforcing existing silos and leaving countries with costly, proprietary software solutions with duplicative functionality and little interoperability across government agencies, much less underpinning private sector innovation. These silos are further codified through the development of sector-specific maturity models and metrics. An effective DPG ecosystem has the potential to enable the reuse and improvement of existing tools, thereby lowering overall cost of deploying technology solutions and increasing efficient implementation.

Beyond this proven reusability of DPGs and the associated cost and deployment efficiencies, do DPGs have even more transformational potential? Increasingly, there is interest in DPGs as drivers of inclusion and products through which to standardize and safeguard rights; these opportunities are less understood and remain unproven. To begin to fill that gap, this paper first examines the unique value proposition of DPGs in supporting open societies by advancing more equitable systems and by codifying rights. The paper then considers the persistent challenges to more fully realizing this opportunity and offers some recommendations for how to address these challenges…(More)”.

What is the value of data? A review of empirical methods


Policy brief by Diane Coyle and Annabel Manley: “The economy has been transformed by data in recent years. Data-driven firms made up seven of the global top 10 firms by stock market capitalisation in 2021; and across the OECD (Organisation for Economic Co-operation and Development) economies there has been a growing gap in terms of productivity and profitability between firms that use data intensively and the rest (e.g. Brynjolfsson et al 2019; Bajgar et al 2022; Coyle et al 2022). The widespread availability of data and analytics has also begun to extend into the public sector and policymaking, for example with ‘following the science’ – implying intense use of data – becoming a tagline for the handling of the COVID-19 pandemic in the UK and elsewhere.

It is therefore obvious that data has value in an economically meaningful sense. The sources of its value and characteristics of data as an economic asset are discussed at length in our earlier Value of Data report (Coyle et al 2020a). We concluded that there is potential value to the economy as a whole from having the ability to use data, and not just to the organisations that control specific data sets. This appreciation is increasingly reflected in many policy statements of data strategy and the broader debate about the governance of data (e.g. European Parliament 2022). The value of data is also explicitly and implicitly acknowledged by firms that sell data services, and investors who take dataset assets into account in stock market valuations or mergers and acquisitions.

However, despite the broad recognition of its value, and the need to develop appropriate policy frameworks, there is still no consensus method for empirically determining the value of data. Without this, the full potential will not be realised (Verhulst 2018). There are not even many examples of markets for data that would indicate a private valuation (although not the wider social value). Yet estimates of the value of data are needed to determine an appropriate level of investment, as well as a better understanding of how data can contribute value to the economy and how to govern the collection and use of different types of data.

This brief presents an overview of a range of alternative methods for data valuation, including those proposed in the existing literature. This includes some relatively widely used methods and others that are more specialist or preliminary…(More)”.

Localising AI for crisis response


Report by Aleks Berditchevskaia, Kathy Peach and Isabel Stewart: “Putting power back in the hands of frontline humanitarians and local communities.

This report documents the results of a year-long project to design and evaluate new proof-of-concept Collective Crisis Intelligence tools. These are tools that combine data from crisis-affected communities with the processing power of AI to improve humanitarian action.

The two collective crisis intelligence tool prototypes developed were:

  • NFRI-Predict: a tool that predicts which non-food relief items (NFRI) are most needed by different types of households in different regions of Nepal after a crisis.
  • Report and Respond: a French language SMS-based tool that allows Red Cross volunteers in Cameroon to check the accuracy of COVID-19 rumours or misinformation they hear from the community while they’re in the field, and receive real-time guidance on appropriate responses.

Both tools were developed using Nesta’s Participatory AI methods, which aimed to address some of the risks associated with humanitarian AI by involving local communities in the design, development and evaluation of the new tools.

The project was a partnership between Nesta’s Centre for Collective Intelligence Design (CCID) and Data Analytics Practice (DAP), the Nepal Red Cross and Cameroon Red Cross, IFRC Solferino Academy, and Open Lab at Newcastle University, and it was funded by the UK Humanitarian Innovation Hub.

We found that collective crisis intelligence:

  • has the potential to make local humanitarian action more timely and appropriate to local needs.
  • can transform locally-generated data to drive new forms of (anticipatory) action.

We found that participatory AI:

  • can overcome several critiques and limitations of AI – as well as helping to improve model performance.
  • helps to surface tensions between the assumptions and standards set by AI gatekeepers versus the pragmatic reality of implementation.
  • creates opportunities for building and sharing new capabilities among frontline staff and data scientists.

We also validated that collective crisis intelligence and participatory AI can help increase trust in AI tools, but more research is needed to untangle the factors that were responsible…(More)”.

Protecting Children in Cyberconflicts


Paper by Eleonore Pauwels: “Just as digital technologies have transformed myriad aspects of daily life, they are now transforming war, politics and the social fabric.

This rapid analysis examines the ways in which cyberconflict adversely affects children and offers actions that could strengthen safeguards to protect them.

Cyberconflict can impact children directly or indirectly. Harms range from direct targeting for influence and recruitment into armed forces and armed groups, to personal data manipulation and theft, to cyber attacks on infrastructure across sectors critical to child well-being such as education and health facilities.

Many experts believe that the combination of existing international humanitarian law, international criminal law, human rights law, and child rights law is adequate to address the emerging issues posed by cyberconflict. Nevertheless, several key challenges persist. Attributing cyber attacks to specific actors and ensuring accountability have proven challenging, particularly in the so-called grey zone between war and peace.

There is an urgent need to clarify how child rights apply in the digital space and for Member States to place these rights at the centre of regulatory frameworks and legislation on new technologies…(More)”.

Use of Data in Public Sector Human Resources and Workforce Management: Solutions and Challenges


White Paper by Katherine Barrett and Richard Greene: “Across the U.S., a growing number of cities, counties, and states are using data across agencies to improve management and make decisions—and HR and payroll professionals in particular stand to gain much from this data to help drive staffing and other strategic decisions. In this white paper, industry experts Katherine Barrett and Richard Greene take a deep dive into both the benefits and challenges of using data with real-life examples of how data has been instrumental in building a resilient HR apparatus.

Data can be used for positive change that includes shorter new-hire onboarding, fairer overtime distribution, and even improved employee safety. However, obstacles to using data in an optimal way to improve HR management, such as insufficient funding, lack of training, and lack of software access, can keep government organizations from making the most of all it can offer.

Despite barriers, many organizations are moving toward creating a culture that is conducive to using the data their systems generate. Examples of how data and data analysis can transform workforce management practices include:

  • Studying existing hiring and onboarding data to facilitate more effective and efficient administration
  • Tracking turnover data to document employee departures and reveal information about those most at risk of sudden departure
  • Reducing overtime by using the data to ensure fairer distribution of overtime
  • Uncovering equity issues by comparing the demographic makeup of a workforce with the population it serves…(More)”
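The overtime bullet above can be illustrated with a toy calculation; the record format, names and flagging rule here are hypothetical, not taken from the white paper:

```python
# Illustrative sketch only: a toy overtime-distribution analysis of the kind
# the white paper describes. Employee names and hours are made up, and the
# "mean plus one standard deviation" flagging rule is an assumption.
from statistics import mean, pstdev

# Hypothetical payroll extract: employee -> overtime hours this quarter.
overtime_hours = {
    "A. Rivera": 96, "B. Chen": 12, "C. Okafor": 88, "D. Patel": 10, "E. Smith": 14,
}

avg = mean(overtime_hours.values())
spread = pstdev(overtime_hours.values())

# Flag employees far above the average as candidates for redistribution.
overloaded = [name for name, hrs in overtime_hours.items() if hrs > avg + spread]
print(f"average OT: {avg:.1f}h, flagged for redistribution: {overloaded}")
```

Even a simple cut like this surfaces whether overtime is concentrated in a few employees, which is the kind of fairness question the examples above point to.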

Energy Data Sharing: The Case of EV Smart Charging


Paper by Sean Ennis and Giuseppe Colangelo: “The green and digital transitions are concomitantly underway. In its upcoming Action Plan on Digitalisation of Energy, the European Commission aims to develop a digital-driven “European energy data space” to allow for data sharing and system integration between the energy sector and other sectors, e.g. mobility.

CERRE (the Centre on Regulation in Europe) has begun working at the intersection of digital and energy with a new, cross-sector research initiative aimed at identifying the business case and governance principles for the development of a European energy data space, using the concrete example of smart electric vehicle charging points, which will play an important role in increasing the flexibility and efficiency of the energy sector.

Key research questions to be addressed as part of the project are:

  • What property rights are included within the smart charging data?
  • What is the business case for industry players and customers to share their data?
  • What should be the overarching principles governing a European energy data space?
  • What government interventions or data standards are required to make specific use cases successful for achieving green transition goals?…(More)”.

Confronting Reality in Cyberspace: Foreign Policy for a Fragmented Internet


Report by a Council on Foreign Relations Task Force: “…The Task Force proposes three pillars of a foreign policy that should guide Washington’s adaptation to today’s more complex, variegated, and dangerous cyber realm.

First, Washington should confront reality and consolidate a coalition of allies and friends around a vision of the internet that preserves—to the greatest degree possible—a trusted, protected international communication platform.

Second, the United States should balance more targeted diplomatic and economic pressure on adversaries, as well as more disruptive cyber operations, with clear statements about self-imposed restraint on specific types of targets agreed to among U.S. allies.

Third, the United States needs to put its own proverbial house in order. That requirement calls for Washington to link more cohesively its policy for digital competition with the broader enterprise of national security strategy.

The major recommendations of the Task Force are as follows:

  • Build a digital trade agreement among trusted partners.
  • Agree to and adopt a shared policy on digital privacy that is interoperable with Europe’s General Data Protection Regulation (GDPR).
  • Resolve outstanding issues on U.S.-European Union (EU) data transfers.
  • Create an international cybercrime center.
  • Launch a focused program for cyber aid and infrastructure development.
  • Work jointly across partners to retain technology superiority.
  • Declare norms against destructive attacks on election and financial systems.
  • Negotiate with adversaries to establish limits on cyber operations directed at nuclear command, control, and communications (NC3) systems.
  • Develop coalition-wide practices for the Vulnerabilities Equities Process (VEP).
  • Adopt greater transparency about defend forward actions.
  • Hold states accountable for malicious activity emanating from their territories.
  • Make digital competition a pillar of the national security strategy.
  • Clean up U.S. cyberspace by offering incentives for internet service providers (ISPs) and cloud providers to reduce malicious activity within their infrastructure.
  • Address the domestic intelligence gap.
  • Promote the exchange of and collaboration among talent from trusted partners.
  • Develop the expertise for cyber foreign policy.

A free, global, and open internet was a worthy aspiration that helped guide U.S. policymakers for the internet’s first thirty years. The internet as it exists today, however, demands a reconsideration of U.S. cyber and foreign policies to confront these new realities. The Task Force believes that U.S. goals moving forward will be more limited and thus more attainable, but the United States needs to act quickly to design strategies and tactics that can ameliorate an urgent threat…(More)”.

Non-Fungible Tokens (NFTs)


Report by the Congressional Research Service: “Non-fungible tokens (NFTs) have become popular as unique and non-interchangeable units of data that signify ownership of associated digital items, such as images, music, or videos. Token “ownership” is recorded and tracked on a blockchain (a digital database that records data on a decentralized network of computers without the use of a central authority). In the future, supporters believe NFTs will be used as digital representations of physical items, such as a deed to a house or title to a car. NFTs are commonly used to record and represent ownership of an item, verify authenticity, and enable exchange. However, they do not necessarily reflect the legal ownership of an asset or grant copyright to a digital or physical item. NFT owners purchase only the right to the NFT’s blockchain metadata or “token,” not the underlying asset, unless otherwise specified in external contracts or terms and conditions. NFTs share many similarities with cryptocurrencies, and they are commonly bought and traded using cryptocurrency. Both NFTs and cryptocurrencies are built and tracked on blockchains, and they share much of the same customer and community base. However, cryptocurrencies are fungible, meaning interchangeable, whereas NFTs are unique and therefore non-fungible. Most users create and buy NFTs on dedicated NFT marketplaces. A typical NFT is created or “minted” on a blockchain, auctioned off or sold at a fixed price on an NFT marketplace, and “stored” in the buyer’s digital wallet. Smart contracts (self-executing contracts or lines of computer code on a blockchain) can mint NFTs or transfer them from one owner to another. In combination, blockchains and smart contracts are the backbone of the NFT ecosystem…

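The mint/transfer lifecycle described above can be sketched as a toy, in-memory registry. This is an illustration of the ownership-tracking idea only, not how on-chain smart contracts (such as those following the ERC-721 standard) are actually implemented; the class and its methods are hypothetical:

```python
# Illustrative sketch only: a toy "NFT registry" showing the ownership-tracking
# logic the report describes (mint, transfer, lookup). Real NFTs live on a
# blockchain and are managed by smart contracts; nothing here is persistent
# or decentralized, and all names are hypothetical.
import hashlib
import itertools

class ToyNFTRegistry:
    def __init__(self):
        self._owners = {}            # token_id -> owner address
        self._metadata = {}          # token_id -> fingerprint of the item's URI
        self._ids = itertools.count(1)

    def mint(self, owner: str, item_uri: str) -> int:
        """Record a new unique token pointing at a digital item (e.g. an image URI)."""
        token_id = next(self._ids)
        self._owners[token_id] = owner
        # The token stores only metadata about the item (here, a hash of its
        # URI), not the item itself -- mirroring the report's point that buyers
        # own the token, not the underlying asset.
        self._metadata[token_id] = hashlib.sha256(item_uri.encode()).hexdigest()
        return token_id

    def transfer(self, token_id: int, seller: str, buyer: str) -> None:
        """Change ownership, as a smart contract would on a marketplace sale."""
        if self._owners.get(token_id) != seller:
            raise PermissionError("seller does not own this token")
        self._owners[token_id] = buyer

    def owner_of(self, token_id: int) -> str:
        return self._owners[token_id]
```

The non-fungibility the report emphasizes shows up in the fact that each `token_id` is unique and tokens are tracked individually, unlike fungible cryptocurrency balances that can simply be added together.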

Despite substantial market growth over the past two years, NFTs are still relatively nascent. In their current form, NFTs have implications in a variety of policy areas:
  • Consumer protection. There are a number of risks to consumers in the NFT ecosystem, and some NFT marketplaces and digital wallets lack basic features to protect consumers from fraud and misleading or deceptive practices.
  • Financial regulation. Depending on the purpose and use of NFTs, some NFTs and NFT platforms may fall under existing financial regulatory regimes and definitions.
  • Copyright and intellectual property. The relationship between NFTs and the legal ownership of digital or physical property is unclear. Some existing regulations may impact NFT markets.
  • Energy and environmental. Both minting and selling NFTs require substantial amounts of energy, which has raised concerns about their environmental impact…(More)”.