Data Governance Toolkit: Navigating Data in the Digital Age


Toolkit by the Broadband Commission Working Group on Data Governance: “…the Toolkit serves as a practical, capacity-building resource for policymakers, regulators, and governments. It offers actionable guidance on key data governance priorities — including legal frameworks, institutional roles, cross-border data flows, digital self-determination, and data for AI.

As a key capacity-building resource, the Toolkit aims to empower policymakers, regulators, and data practitioners to navigate the complexities of data governance in the digital era. Plans are currently underway to translate the Toolkit into French, Spanish, Chinese, and Arabic to ensure broader global accessibility and impact. Pilot implementation at country level is also being explored for Q4 2025 to support national-level uptake.

The Data Governance Toolkit

The Data Governance Toolkit: Navigating Data in the Digital Age offers a practical, rights-based guide to help governments, institutions, and stakeholders make data work for all.  

The Toolkit is organized around four foundational data governance components—referred to as the 4Ps of Data Governance: 

  • Why (Purpose): How to define a vision and purpose for data governance in the context of AI, digital transformation, and sustainable development. 
  • How (Principles): What principles should guide a governance framework to balance innovation, security, and ethical considerations. 
  • Who (People and Processes): Identifying the stakeholders, institutions, and processes required to build and enforce responsible governance structures. 
  • What (Practices and Mechanisms): Policies and best practices to manage data across its entire lifecycle while ensuring privacy, interoperability, and regulatory compliance.

The Toolkit also includes: 

  • A self-assessment framework to help organizations evaluate their current capabilities; 
  • A glossary of key terms to foster shared understanding;  
  • A curated list of other toolkits and frameworks for deeper engagement. 

Designed to be adaptable across regions and sectors, the Data Governance Toolkit is not a one-size-fits-all manual but a modular resource to guide smarter, safer, and fairer data use in the digital age…(More)”.

Researchers’ access to information from regulated online services


Report by Ofcom (UK): “…We outline three potential policy options and models for facilitating greater researcher access, which include:

  1. Clarify existing legal rules: Relevant authorities could provide additional guidance on what is already legally permitted for researcher access on important issues, such as data donations and research-related scraping.
  2. Create new duties, enforced by a backstop regulator: Services could be required to put in place systems and processes to operationalise data access. This could include new duties on regulated services to create standard procedures for researcher accreditation. Services would be responsible for providing researchers with data directly or for providing the interface through which they can access it, as well as for offering appeal and redress mechanisms. A backstop regulator – either an existing or a new body – could enforce these duties.
  3. Enable and manage access via an independent intermediary: New legal powers could be granted to a trusted third party, which would facilitate and manage researchers’ access to data. This intermediary – which could again be an existing or new body – would accredit researchers and provide secure access.

Our report describes three types of intermediary that could be considered – direct access intermediary, notice to service intermediary, and repository intermediary models.

  • Direct access intermediary. Researchers could request data with an intermediary facilitating secure access. In this model, services could retain responsibility for hosting and providing data while the intermediary maintains the interface by which researchers request access.
  • Notice to service intermediary. Researchers could apply for accreditation and request access to specific datasets via the intermediary. This could include data that would not be accessible in direct access models. The intermediary would review and refuse or approve access. Services would then be required to provide access to the approved data.
  • Repository intermediary. The intermediary could itself provide direct access to data, by providing an interface for data access and/or hosting the data itself and taking responsibility for data governance. This could also include data that would not be accessible in direct access models…(More)”.

How a new platform is future-proofing governance for the intelligent age


Article by Kelly Ommundsen: “We are living through one of the most transformative moments in human history. Technologies like artificial intelligence (AI), quantum computing and synthetic biology are accelerating change at a pace few institutions are prepared to manage. Yet while innovation is leaping forward, regulation often remains standing still – constrained by outdated models, fragmented approaches and a reactive mindset…

To address this growing challenge, the World Economic Forum, in collaboration with the UAE’s General Secretariat of the Cabinet, has launched the Global Regulatory Innovation Platform (GRIP).

GRIP is a new initiative designed to foster human-centred, forward-looking and globally coordinated approaches to regulation. Its goal: to build trust, reduce uncertainty and accelerate innovation that serves the public good.

This platform builds on the World Economic Forum’s broader body of work on agile governance. As outlined in the Forum’s 2020 report, Agile Governance: Reimagining Policy-making in the Fourth Industrial Revolution, traditional regulatory approaches – characterized by top-down control and infrequent updates – are increasingly unfit for the pace, scale and complexity of modern technological change…(More)”.

Artificial Democracy: The Impact of Big Data on Politics, Policy, and Polity


Book edited by Cecilia Biancalana and Eric Montigny: “Democracy and data have a complicated relationship. Under the influence of big data and artificial intelligence, some democracies are being transformed as relations between citizens, political parties, governments, and corporations are gradually redrawn.

Artificial Democracy explores the ways in which data collection and analytics and their application are changing political practices, government policies, and even democratic polities themselves. With an international roster of multidisciplinary contributors, this topical collection takes a comprehensive approach to big data’s effect on democracy, from the use of micro-targeting in electoral campaigns to the clash between privacy and surveillance in the name of protecting society.

The book tackles both the dangers and the potentially desirable changes made possible by the symbiosis of big data and artificial intelligence. It explores shifts in how we conceptualize the citizen-government relationship and asks important questions about where we could be heading…(More)”.

Sudden loss of key US satellite data could send hurricane forecasting back ‘decades’


Article by Eric Holthaus: “A critical US atmospheric data collection program will be halted by Monday, giving weather forecasters just days to prepare, according to a public notice sent this week. Scientists that the Guardian spoke with say the change could set hurricane forecasting back “decades”, just as this year’s season ramps up.

In a National Oceanic and Atmospheric Administration (Noaa) message sent on Wednesday to its scientists, the agency said that “due to recent service changes” the Defense Meteorological Satellite Program (DMSP) will “discontinue ingest, processing and distribution of all DMSP data no later than June 30, 2025”.

Due to their unique characteristics and ability to map the entire world twice a day with extremely high resolution, the three DMSP satellites are a primary source of information for scientists to monitor Arctic sea ice and hurricane development. The DMSP partners with Noaa to make weather data collected from the satellites publicly available.

The reasons for the changes, and which agency was driving them, were not immediately clear. Noaa said they would not affect the quality of forecasting.

However, the Guardian spoke with several scientists inside and outside of the US government whose work depends on the DMSP, and all said there are no other US programs that can form an adequate replacement for its data.

“We’re a bit blind now,” said Allison Wing, a hurricane researcher at Florida State University. Wing said the DMSP satellites are the only ones that let scientists see inside the clouds of developing hurricanes, giving them a critical edge in forecasting that now may be jeopardized.

“Before these types of satellites were present, there would often be situations where you’d wake up in the morning and have a big surprise about what the hurricane looked like,” said Wing. “Given increases in hurricane intensity and increasing prevalence towards rapid intensification in recent years, it’s not a good time to have less information.”…(More)”.

Unpacking OpenAI’s Amazonian Archaeology Initiative


Article by Lori Regattieri: “What if I told you that one of the most well-capitalized AI companies on the planet is asking volunteers to help them uncover “lost cities” in the Amazonia—by feeding machine learning models with open satellite data, lidar, “colonial” text and map records, and indigenous oral histories? This is the premise of the OpenAI to Z Challenge, a Kaggle-hosted hackathon framed as a platform to “push the limits” of AI through global knowledge cooperation. In practice, this is a product development experiment cloaked as public participation. The contributions of users, the mapping of biocultural data, and the modeling of ancestral landscapes all feed into the refinement of OpenAI’s proprietary systems. The task itself may appear novel. The logic is not. This is the familiar playbook of Big Tech firms—capture public knowledge, reframe it as open input, and channel it into infrastructure that serves commercial, rather than communal, goals.

The “challenge” is marketed as a “digital archaeology” experiment that invites participants from all around the world to search for “hidden” archaeological sites in the Amazonia biome (Brazil, Bolivia, Colombia, Ecuador, Guyana, Peru, Suriname, Venezuela, and French Guiana) using a curated stack of open-source data. The competition requires participants to use OpenAI’s latest GPT-4.1 and the o3/o4-mini models to parse multispectral satellite imagery, LiDAR-derived elevation maps (Light Detection and Ranging is a remote sensing technology that uses laser pulses to generate high-resolution 3D models of terrain, including areas covered by dense vegetation), historical maps, and digitized ethnographic archives. Coding teams or individuals need to geolocate “potential” archaeological sites, argue their significance using verifiable public sources, and present reproducible methodologies. Prize incentives total $400,000 USD, with a first-place award of $250,000 split between cash and OpenAI API credits.

While framed as a novel invitation to “anyone” to do archaeological research, the competition focuses mainly on the Brazilian territory, transforming the Amazonia and its peoples into an open laboratory for model testing. What is presented as scientific crowdsourcing is in fact a carefully designed mechanism for refining geospatial AI at scale. Participants supply not just labor and insight, but novel training and evaluation strategies that extend far beyond heritage science and into the commercial logics of spatial computing…(More)”.
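
By way of illustration only, here is a minimal Python sketch of the sort of pipeline the excerpt describes: summarize a LiDAR-derived elevation tile and ask one of the named OpenAI models to reason about possible earthworks. The file name, relief threshold, and prompts are hypothetical assumptions, not the competition’s actual methodology.

```python
# Illustrative sketch only: a "describe-then-ask" loop of the kind the
# OpenAI to Z Challenge seems to call for. The tile path, the anomaly
# heuristic, and the prompts are hypothetical; real entries combine several
# data layers and must document reproducible methods and public sources.
import numpy as np
import rasterio                      # reads LiDAR-derived elevation GeoTIFFs
from openai import OpenAI

TILE_PATH = "amazonia_dem_tile.tif"  # hypothetical elevation tile

with rasterio.open(TILE_PATH) as src:
    elevation = src.read(1).astype(float)
    bounds = src.bounds                      # geographic extent of the tile

# Crude local-relief measure: deviation of each cell from the tile mean.
# Regular patterns of modest relief can hint at earthworks under canopy.
relief = elevation - elevation.mean()
candidate_cells = int((np.abs(relief) > 2.0).sum())   # threshold is a guess

summary = (
    f"Elevation tile covering {bounds}. "
    f"Mean elevation {elevation.mean():.1f} m, std {elevation.std():.1f} m, "
    f"{candidate_cells} cells deviate more than 2 m from the mean."
)

client = OpenAI()  # expects OPENAI_API_KEY in the environment
response = client.chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "system",
         "content": "You assess whether terrain statistics could indicate "
                    "anthropogenic earthworks and suggest follow-up checks."},
        {"role": "user", "content": summary},
    ],
)
print(response.choices[0].message.content)
```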

Will AI speed up literature reviews or derail them entirely?


Article by Sam A. Reynolds: “Over the past few decades, evidence synthesis has greatly increased the effectiveness of medicine and other fields. The process of systematically combining findings from multiple studies into comprehensive reviews helps researchers and policymakers to draw insights from the global literature [1]. AI promises to speed up parts of the process, including searching and filtering. It could also help researchers to detect problematic papers [2]. But in our view, other potential uses of AI mean that many of the approaches being developed won’t be sufficient to ensure that evidence syntheses remain reliable and responsive. In fact, we are concerned that the deployment of AI to generate fake papers presents an existential crisis for the field.

What’s needed is a radically different approach — one that can respond to the updating and retracting of papers over time.

We propose a network of continually updated evidence databases, hosted by diverse institutions as ‘living’ collections. AI could be used to help build the databases. And each database would hold findings relevant to a broad theme or subject, providing a resource for an unlimited number of ultra-rapid and robust individual reviews…

Currently, the gold standard for evidence synthesis is the systematic review. These are comprehensive, rigorous, transparent and objective, and aim to include as much relevant high-quality evidence as possible. They also use the best methods available for reducing bias. In part, this is achieved by getting multiple reviewers to screen the studies; declaring whatever criteria, databases, search terms and so on are used; and detailing any conflicts of interest or potential cognitive biases…(More)”.
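
To make the living-database proposal above more concrete, here is a minimal, hypothetical sketch of its core update step in Python: merge newly screened findings into a snapshot and drop papers that appear on a retraction list. The record fields and identifiers are placeholders, not the authors’ proposed schema.

```python
# Toy sketch of a "living" evidence-database update: add newly screened
# findings and remove anything now retracted, so downstream reviews always
# draw on the current state of the literature. Fields are placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    doi: str          # identifier of the underlying paper
    outcome: str      # what was measured
    effect: float     # summary effect size as extracted by reviewers

def update_living_database(current: dict[str, Finding],
                           newly_screened: list[Finding],
                           retracted_dois: set[str]) -> dict[str, Finding]:
    """Return a refreshed snapshot: add new findings, drop retracted papers."""
    merged = {**current, **{f.doi: f for f in newly_screened}}
    return {doi: f for doi, f in merged.items() if doi not in retracted_dois}

# Example: one prior finding, one new one, and one retraction notice.
db = {"10.1000/a1": Finding("10.1000/a1", "mortality", -0.12)}
incoming = [Finding("10.1000/b2", "mortality", -0.08)]
db = update_living_database(db, incoming, retracted_dois={"10.1000/a1"})
print(sorted(db))   # -> ['10.1000/b2']
```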

The End of the Age of NGOs? How Civil Society Lost Its Post–Cold War Power


Article by Sarah Bush and Jennifer Hadden: “The 1990s were a golden age for nongovernmental organizations. It was a time when well-known groups such as Amnesty International, Greenpeace, and Oxfam grew their budgets and expanded their global reach. Between 1990 and 2000, the number of international NGOs—not-for-profit groups that are largely independent from government and work in multiple countries in pursuit of the public good—increased by 42 percent. Thousands of organizations were founded. Many of these organizations championed liberal causes, such as LGBTQ rights and gun control. Conservative groups emerged, too, with rival policy agendas.

As their numbers grew, NGOs became important political players. Newly minted organizations changed state policies. The International Campaign to Ban Landmines, a coalition of NGOs formed in 1992, successfully pushed for the adoption of the Anti-Personnel Mine Ban Convention in 1997—an effort that won it the Nobel Peace Prize. Transparency International, a Berlin-based NGO established in 1993, raised the profile of corruption issues through its advocacy, building momentum toward the adoption of the UN Convention Against Corruption in 2003. Future UN Secretary-General Kofi Annan declared at the 1993 World Conference on Human Rights that “the twenty-first century will be an era of NGOs.” In an influential 1997 essay in Foreign Affairs, Jessica Mathews argued that the end of the Cold War brought with it a “power shift”: global civil society, often formalized as NGOs, was wresting authority and influence from states. More and more often, Mathews contended, NGOs were taking over responsibilities for the delivery of development and humanitarian assistance, pushing governments around during international negotiations, and setting the policy agenda on issues such as environmental protection and human rights.

Today, however, the picture looks remarkably different…(More)”.

Why Big Tech is threatened by a global push for data sovereignty


Article by Damilare Dosunmu: “A battle for data sovereignty is brewing from Africa to Asia.

Developing nations are challenging Big Tech’s decades-long hold on global data by demanding that their citizens’ information be stored locally. The move is driven by the realization that countries have been giving away their most valuable resource for tech giants to build a trillion-dollar market capitalization.

In April, Nigeria asked Google, Microsoft, and Amazon to set concrete deadlines for opening data centers in the country. Nigeria has been making this demand for about four years, but the companies have so far failed to fulfill their promises. Now, Nigeria has set up a working group with the companies to ensure that data is stored within its shores.

“We told them no more waivers — that we need a road map for when they are coming to Nigeria,” Kashifu Inuwa Abdullahi, director-general of Nigeria’s technology regulator, the National Information Technology Development Agency, told Rest of World.

Other developing countries, including India, South Africa, and Vietnam, have also implemented similar rules demanding that companies store data locally. India’s central bank requires payment companies to host financial data within the country, while Vietnam mandates that foreign telecommunications, e-commerce, and online payments providers establish local offices and keep user data within its shores for at least 24 months…(More)”.

Mapping the Unmapped


Article by Maddy Crowell: “…Most of St. Lucia, which sits at the southern end of an archipelago stretching from Trinidad and Tobago to the Bahamas, is poorly mapped. Aside from strips of sandy white beaches that hug the coastline, the island is draped with dense rainforest. A few green signs hang limp and faded from utility poles like an afterthought, identifying streets named during more than a century of dueling British and French colonial rule. One major road, Micoud Highway, runs like a vein from north to south, carting tourists from the airport to beachfront resorts. Little of this is accurately represented on Google Maps. Almost nobody uses, or has, a conventional address. Locals orient one another with landmarks: the red house on the hill, the cottage next to the church, the park across from Care Growell School.

Our van wound off Micoud Highway into an empty lot beneath the shade of a banana tree. A dog panted, belly up, under the hot November sun. The group had been recruited by the Humanitarian OpenStreetMap Team, or HOT, a nonprofit that uses an open-source data platform called OpenStreetMap to create a map of the world that resembles Google’s with one key exception: Anyone can edit it, making it a sort of Wikipedia for cartographers.

The organization has an ambitious goal: Map the world’s unmapped places to help relief workers reach people when the next hurricane, fire, or other crisis strikes. Since its founding in 2010, some 340,000 volunteers around the world have been remotely editing OpenStreetMap to better represent the Caribbean, Southeast Asia, parts of Africa and other regions prone to natural disasters or humanitarian emergencies. In that time, they have mapped more than 2.1 million miles of roads and 156 million buildings. They use aerial imagery captured by drones, aircraft, or satellites to help trace unmarked roads, waterways, buildings, and critical infrastructure. Once this digital chart is more clearly defined, field-mapping expeditions like the one we were taking add the names of every road, house, church, or business represented by gray silhouettes on their paper maps. The effort fine-tunes the places that bigger players like Google Maps get wrong — or don’t get at all…(More)”.
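
As a small, hedged illustration of how open the underlying data is, the Python sketch below queries OpenStreetMap’s public Overpass API for building footprints near a point on St. Lucia. The coordinates and search radius are assumptions for demonstration; HOT’s own workflows use dedicated tasking and editing tools rather than this ad hoc query.

```python
# Minimal sketch: read back what volunteers have mapped by querying the public
# Overpass API for OpenStreetMap building footprints near a point in St. Lucia.
# Coordinates and radius are illustrative; editing the map itself requires an
# OpenStreetMap account and goes through separate editing tools.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Overpass QL: all ways tagged as buildings within 500 m of the given point.
query = """
[out:json][timeout:25];
way["building"](around:500,13.82,-60.90);
out center;
"""

response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
response.raise_for_status()
elements = response.json().get("elements", [])

print(f"Buildings mapped within 500 m: {len(elements)}")
for way in elements[:5]:
    name = way.get("tags", {}).get("name", "(unnamed)")
    center = way.get("center", {})
    print(name, center.get("lat"), center.get("lon"))
```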