State of Gender Data


Report by Data2X: “Gender data is fundamental to achieving gender equality and the Sustainable Development Goals. It helps identify inequalities, illuminate a path forward, and monitor global progress. As recognition of its importance has grown over the last decade, the availability of gender data—and its use in decision-making—has improved.

Yet overlapping crises, from the COVID-19 pandemic to climate change and conflict, have imperiled progress on gender equality and the Sustainable Development Goals. In 2022, UN Secretary-General António Guterres declared that the Sustainable Development Goals are in need of rescue. The 2022 SDG Gender Index by EM2030 found little progress on global gender equality between 2015 and 2020, and a recent assessment by UN Women demonstrates that more than one quarter of the indicators needed to measure progress on gender equality are “far or very far” from 2030 targets….The State of Gender Data is an evolving Data2X publication and digital experience designed to highlight global progress and spur action on gender data. Data2X will update the initiative annually, providing insight into a new dimension of gender data. For our initial launch, we focus on examining funding trends and highlighting promising solutions and key commitments….(More)”.

Building Trust to Reinforce Democracy


Main Findings from the 2021 OECD Survey on Drivers of Trust in Public Institutions: “What drives trust in government? This report presents the main findings of the first OECD cross-national survey on trust in government and public institutions, representing over 50 000 responses across 22 OECD countries. The survey measures government performance across five drivers of trust – reliability, responsiveness, integrity, openness, and fairness – and provides insights for future policy reforms. This investigation marks an important initiative by OECD countries to measure and better understand what drives people’s trust in public institutions – a crucial part of reinforcing democracy…(More)”.

Closing the Data Divide for a More Equitable U.S. Digital Economy


Report by Gillian Diebold: “In the United States, access to many public and private services, including those in the financial, educational, and health-care sectors, is intricately linked to data. But adequate data is not collected equitably from all Americans, creating a new challenge: the data divide, in which not everyone has enough high-quality data collected about them or their communities and therefore cannot benefit from data-driven innovation. This report provides an overview of the data divide in the United States and offers recommendations for how policymakers can address these inequalities…(More)”.

Making Government Data Publicly Available: Guidance for Agencies on Releasing Data Responsibly


Report by Hugh Grant-Chapman and Hannah Quay-de la Vallee: “Government agencies rely on a wide range of data to effectively deliver services to the populations with which they engage. Civic-minded advocates frequently argue that the public benefits of this data can be better harnessed by making it available for public access. Recent years, however, have also seen growing recognition that the public release of government data can carry certain risks. Government agencies hoping to release data publicly should consider those potential risks in deciding which data to make publicly available and how to go about releasing it.

This guidance offers an introduction to making data publicly available while addressing privacy and ethical data use issues. It is intended for administrators at government agencies that deliver services to individuals — especially those at the state and local levels — who are interested in publicly releasing government data. This guidance focuses on challenges that may arise when releasing aggregated data derived from sensitive information, particularly individual-level data.

The report begins by highlighting key benefits and risks of making government data publicly available. Benefits include empowering members of the general public, supporting research on program efficacy, supporting the work of organizations providing adjacent services, reducing agencies’ administrative burden, and holding government agencies accountable. Potential risks include breaches of individual privacy; irresponsible uses of the data by third parties; and the possibility that the data is not used at all, resulting in wasted resources.

In light of these benefits and risks, the report presents four recommended actions for publishing government data responsibly:

  1. Establish data governance processes and roles;
  2. Engage external communities;
  3. Ensure responsible use and privacy protection; and
  4. Evaluate resource constraints.

These key considerations also take into account federal and state laws as well as emerging computational and analytical techniques for protecting privacy when releasing data, such as differential privacy techniques and synthetic data. Each of these techniques involves unique benefits and trade-offs to be considered in the context of the goals of a given data release…(More)”.
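As a rough illustration of the first of these techniques (not drawn from the report itself), a differentially private release adds calibrated noise to aggregate statistics before publication. The sketch below applies the Laplace mechanism to a simple count; the count and parameters are hypothetical.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    Adding or removing one individual changes a count by at most `sensitivity`,
    so noise drawn from Laplace(scale = sensitivity / epsilon) bounds how much
    any single record can influence the published figure.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Hypothetical example: households receiving a benefit in one county.
true_count = 1_482
print(dp_count(true_count, epsilon=1.0))  # e.g. 1483.7 -- close to the truth, individuals stay protected
```

Smaller values of epsilon give stronger privacy but noisier published statistics, which is exactly the kind of trade-off the report asks agencies to weigh against the goals of a given release.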

Mapping community resources for disaster preparedness: humanitarian data capability and automated futures


Report by Anthony McCosker et al: “This report details the rationale, background research and design for a platform to help local communities map resources for disaster preparedness. It sets out a first step in improving community data capability through resource mapping to enhance humanitarian action before disaster events occur. The project seeks to enable local community disaster preparedness and thus build community resilience by improving the quality of data about community strengths, resources and assets.

In this report, the authors define a gap in existing humanitarian mapping approaches and the uses of open, public and social media data in humanitarian contexts. The report surveys current knowledge and presents a selection of case studies delivering data and humanitarian mapping in local communities.

Drawing on this knowledge and practice review and stakeholder workshops throughout 2021, the authors also define a method and toolkit for the effective use of community assets data…(More)”.

Big, Open Data for Development: A Vision for India 


Paper by Sam Asher, Aditi Bhowmick, Alison Campion, Tobias Lunt and Paul Novosad: “The government generates terabytes of data directly and incidentally in the operation of public programs. For intrinsic and instrumental reasons, these data should be made open to the public. Intrinsically, a right to government data is implicit in the right to information. Instrumentally, open government data will improve policy, increase accountability, empower citizens, create new opportunities for private firms, and lead to development and economic growth. A series of case studies demonstrates these benefits in a range of other contexts. We next examine how government can maximize social benefit from government data. This entails opening administrative data as far upstream in the data pipeline as possible. Most administrative data can be minimally aggregated to protect privacy, while providing data with high geographic granularity. We assess the status quo of the Government of India’s data production and dissemination pipeline, and find that the greatest weakness lies in the last mile: making government data accessible to the public. This means more than posting it online; we describe a set of principles for lowering the access and use costs close to zero. Finally, we examine the use of government data to guide policy in the COVID-19 pandemic. Civil society played a key role in aggregating, disseminating, and analyzing government data, providing analysis that was essential to policy response. However, key pieces of data, like testing rates and seroprevalence distribution, were unnecessarily withheld by the government; this data could have substantially improved the policy response. A more open approach to government data would have saved many lives…(More)”.
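The paper's point about minimal aggregation with high geographic granularity can be illustrated with a small, hypothetical sketch (not taken from the paper): individual-level administrative records are grouped to the finest geography available, and small cells are suppressed before release. All column names and values below are illustrative.

```python
import pandas as pd

# Hypothetical individual-level administrative records.
records = pd.DataFrame({
    "village_code": ["V001", "V001", "V001", "V002", "V002", "V003"],
    "received_benefit": [1, 0, 1, 1, 1, 0],
})

# Aggregate to the smallest geography, then suppress cells with too few records
# so that no individual can be singled out from the published statistics.
MIN_CELL_SIZE = 5
village_stats = (
    records.groupby("village_code")
    .agg(n_households=("received_benefit", "size"),
         share_receiving=("received_benefit", "mean"))
    .reset_index()
)
village_stats.loc[village_stats["n_households"] < MIN_CELL_SIZE,
                  "share_receiving"] = None  # suppress small cells
print(village_stats)
```

The design choice is to aggregate only as much as privacy requires: the released table stays at village level rather than being rolled up to district or state, preserving the geographic detail that makes the data useful for research and accountability.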

Can open-source technologies support open societies?


Report by Victoria Welborn and George Ingram: “In the 2020 “Roadmap for Digital Cooperation,” U.N. Secretary-General António Guterres highlighted digital public goods (DPGs) as a key lever in maximizing the full potential of digital technology to accelerate progress toward the Sustainable Development Goals (SDGs) while also helping overcome some of its persistent challenges. 

The Roadmap rightly pointed to the fact that, as with any new technology, there are risks around digital technologies that might be counterproductive to fostering prosperous, inclusive, and resilient societies. In fact, without intentional action by the global community, digital technologies may more naturally exacerbate exclusion and inequality by undermining trust in critical institutions, allowing consolidation of control and economic value by the powerful, and eroding social norms through breaches of privacy and disinformation campaigns. 

Just as the pandemic has served to highlight the opportunity for digital technologies to reimagine and expand the reach of government service delivery, so too has it surfaced specific risks that are hallmarks of closed societies and authoritarian states—creating new pathways to government surveillance, reinforcing existing socioeconomic inequalities, and enabling the rapid proliferation of disinformation. Why then—in the face of these real risks—focus on the role of digital public goods in development?

As the Roadmap noted, DPGs are “open source software, open data, open AI models, open standards and open content that adhere to privacy and other applicable laws and best practices, do no harm, and help attain the SDGs.”[1] There are a number of reasons why such products have unique potential to accelerate development efforts, including widely recognized benefits related to more efficient and cost-effective implementation of technology-enabled development programming. 

Historically, the use of digital solutions for development in low- and middle-income countries (LMICs) has been supported by donor investments in sector-specific technology systems, reinforcing existing silos and leaving countries with costly, proprietary software solutions with duplicative functionality and little interoperability across government agencies, much less underpinning private sector innovation. These silos are further codified through the development of sector-specific maturity models and metrics. An effective DPG ecosystem has the potential to enable the reuse and improvement of existing tools, thereby lowering overall cost of deploying technology solutions and increasing efficient implementation.

Beyond this proven reusability of DPGs and the associated cost and deployment efficiencies, do DPGs have even more transformational potential? Increasingly, there is interest in DPGs as drivers of inclusion and products through which to standardize and safeguard rights; these opportunities are less understood and remain unproven. To begin to fill that gap, this paper first examines the unique value proposition of DPGs in supporting open societies by advancing more equitable systems and by codifying rights. The paper then considers the persistent challenges to more fully realizing this opportunity and offers some recommendations for how to address these challenges…(More)”.

What is the value of data? A review of empirical methods


Policy brief by Diane Coyle and Annabel Manley: “The economy has been transformed by data in recent years. Data-driven firms made up seven of the global top 10 firms by stock market capitalisation in 2021; and across the OECD (Organisation for Economic Co-operation and Development) economies there has been a growing gap in terms of productivity and profitability between firms that use data intensively and the rest (e.g. Brynjolfsson et al 2019; Bajgar et al 2022; Coyle et al 2022). The widespread availability of data and analytics has also begun to extend into the public sector and policymaking, for example with ‘following the science’ – implying intense use of data – becoming a tagline for the handling of the COVID-19 pandemic in the UK and elsewhere.

It is therefore obvious that data has value in an economically meaningful sense. The sources of its value and characteristics of data as an economic asset are discussed at length in our earlier Value of Data report (Coyle et al 2020a). We concluded that there is potential value to the economy as a whole from having the ability to use data, and not just to the organisations that control specific data sets. This appreciation is increasingly reflected in many policy statements of data strategy and the broader debate about the governance of data (e.g. European Parliament 2022). The value of data is also explicitly and implicitly acknowledged by firms that sell data services, and investors who take dataset assets into account in stock market valuations or mergers and acquisitions.

However, despite the broad recognition of its value, and the need to develop appropriate policy frameworks, there is still no consensus method for empirically determining the value of data. Without this, the full potential will not be realised (Verhulst 2018). There are not even many examples of markets for data that would indicate a private valuation (although not the wider social value). Yet estimates of the value of data are needed to determine an appropriate level of investment, as well as a better understanding of how data can contribute value to the economy and how to govern the collection and use of different types of data.

This brief presents an overview of a range of alternative methods for data valuation, including those proposed in the existing literature. This includes some relatively widely used methods and others that are more specialist or preliminary…(More)”.

Localising AI for crisis response


Report by Aleks Berditchevskaia, Kathy Peach and Isabel Stewart: “Putting power back in the hands of frontline humanitarians and local communities.

This report documents the results of a year-long project to design and evaluate new proof-of-concept Collective Crisis Intelligence tools. These are tools that combine data from crisis-affected communities with the processing power of AI to improve humanitarian action.

The two collective crisis intelligence tool prototypes developed were:

  • NFRI-Predict: a tool that predicts which non-food relief items (NFRI) are most needed by different types of households in different regions of Nepal after a crisis (a rough illustrative sketch of this kind of prediction task follows this list).
  • Report and Respond: a French language SMS-based tool that allows Red Cross volunteers in Cameroon to check the accuracy of COVID-19 rumours or misinformation they hear from the community while they’re in the field, and receive real-time guidance on appropriate responses.
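As a hypothetical sketch of the kind of supervised prediction task NFRI-Predict addresses (this is not code from the project, and all column names, categories and values are illustrative), a model can learn to map household characteristics to the relief item most likely to be needed:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical household features and the relief item reported as most needed.
households = pd.DataFrame({
    "region": ["Province 1", "Province 2", "Province 1", "Province 3", "Province 2", "Province 3"],
    "household_size": [4, 6, 2, 5, 3, 7],
    "house_damaged": [1, 1, 0, 1, 0, 1],
    "most_needed_item": ["tarpaulin", "blanket", "water_container", "tarpaulin", "blanket", "tarpaulin"],
})

# Encode the categorical region; numeric columns pass through unchanged.
features = ColumnTransformer(
    [("region", OneHotEncoder(handle_unknown="ignore"), ["region"])],
    remainder="passthrough",
)

model = Pipeline([
    ("features", features),
    ("classifier", RandomForestClassifier(n_estimators=200, random_state=0)),
])

X = households[["region", "household_size", "house_damaged"]]
y = households["most_needed_item"]
model.fit(X, y)

# Predict the most needed item for a new, unseen household.
new_household = pd.DataFrame({"region": ["Province 1"], "household_size": [5], "house_damaged": [1]})
print(model.predict(new_household))  # e.g. ['tarpaulin']
```

In the participatory approach described below, choices such as which household features to collect and which output categories matter would be made with affected communities rather than by data scientists alone.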

Both tools were developed using Nesta’s Participatory AI methods, which aimed to address some of the risks associated with humanitarian AI by involving local communities in the design, development and evaluation of the new tools.

The project was a partnership between Nesta’s Centre for Collective Intelligence Design (CCID) and Data Analytics Practice (DAP), the Nepal Red Cross and Cameroon Red Cross, IFRC Solferino Academy, and Open Lab at Newcastle University, and it was funded by the UK Humanitarian Innovation Hub.

We found that collective crisis intelligence:

  • has the potential to make local humanitarian action more timely and appropriate to local needs.
  • can transform locally-generated data to drive new forms of (anticipatory) action.

We found that participatory AI:

  • can overcome several critiques and limitations of AI – as well as helping to improve model performance.
  • helps to surface tensions between the assumptions and standards set by AI gatekeepers versus the pragmatic reality of implementation.
  • creates opportunities for building and sharing new capabilities among frontline staff and data scientists.

We also validated that collective crisis intelligence and participatory AI can help increase trust in AI tools, but more research is needed to untangle the factors that were responsible…(More)”.

Protecting Children in Cyberconflicts


Paper by Eleonore Pauwels: “Just as digital technologies have transformed myriad aspects of daily life, they are now transforming war, politics and the social fabric.

This rapid analysis examines the ways in which cyberconflict adversely affects children and offers actions that could strengthen safeguards to protect them.

Cyberconflict can impact children directly or indirectly. Harms range from direct targeting for influence and recruitment into armed forces and armed groups, to personal data manipulation and theft, to cyber attacks on infrastructure across sectors critical to child well-being such as education and health facilities.

Many experts believe that the combination of existing international humanitarian law, international criminal law, human rights law, and child rights law is adequate to address the emerging issues posed by cyberconflict. Nevertheless, several key challenges persist. Attributing cyber attacks to specific actors and ensuring accountability have proven challenging, particularly in the so-called grey zone between war and peace.

There is an urgent need to clarify how child rights apply in the digital space and for Member States to place these rights at the centre of regulatory frameworks and legislation on new technologies…(More)”.