Stefaan Verhulst

Article by Shannon Dosemagen and Gwen Ottinger: “…maintaining healthy communities requires information. To respond effectively to accidents and releases, residents and emergency responders need real-time measurements of hazardous chemicals in the air. Residents who observe soot or dust blanketing their communities need information about its chemical composition to share with their health providers. Closing the refinery in Benicia requires still more information, so the city can understand the levels of toxins left in the soil and the risks of further exposures from clean-up processes. Faced with the choice between the razing of an oil refinery and its conversion to a renewables facility, communities should be able to compare the status quo with expected emissions and safety risks for multiple future scenarios.

Creating the kind of knowledge base necessary for such consequential decisions would require long-term coordination across the many communities affected by energy infrastructure. Places like Benicia, Martinez, and Rodeo would need a place to store data about pollution before, during, and after major changes at nearby energy facilities. They would need to have a way of sharing their data and analyses with other similarly situated communities if they chose to do so, and they would need to be able to access data and analyses from other communities just as easily. Academic and nonprofit researchers with a bird’s eye view of the issues could also enhance knowledge infrastructures if they had access to data shared by communities and a way not only to disseminate their findings, but to share their methodologies for communities to adapt and deploy.

Existing data infrastructures can’t support this kind of collective learning about environmental issues. Both the technical and governance aspects of the infrastructure would need significant upgrades, and the customary models for funding science in the United States don’t offer the kinds of investment that would be necessary. Funding is typically structured around short grant cycles and discrete deliverables, making it difficult to support the long-term, shared stewardship that this infrastructure requires. Addressing these hurdles could enable creation of a robust environmental knowledge commons maintained by a plethora of users and contributors. Such a commons could ensure the continued capacity to generate new insights about the impacts of pollution and environmental change, forming a durable basis for evidence-informed public policy, whether or not the federal government continues to support environmental science. An environmental knowledge commons could, moreover, offer a model for ongoing advancement in other fields of science where traditional funding models have become precarious, even as their knowledge remains essential to public well-being…(More)”.

Constructing a New Knowledge Infrastructure

Report by Bronwyn Carlson and Tamika Worrell: “Artificial intelligence is increasingly embedded in everyday life in Australia, shaping communication, services, and relationships. This report presents findings from the Relational Futures project, an Indigenous-led study examining how Aboriginal and Torres Strait Islander peoples are encountering and responding to AI, including generative systems, automated decision-making tools, and AI companions. The research draws on a mixed-methods approach combining an online survey with 36 respondents and yarning circles with 22 participants, providing both broad and in-depth insight into Indigenous experiences of AI across community and professional contexts. The report presents initial findings while the project continues…(More)”.

Relational futures: Indigenous sovereignty and the governance of artificial intelligence (AI)

Book edited by Daryl Lim and Peter K. Yu: “As artificial intelligence and big data analytics reshape economies and societies, the promise of innovation is increasingly shadowed by concerns over inclusion, equity, and global justice. This accessible, interdisciplinary volume brings together established and emerging voices from across the world to critically examine issues lying at the intersection of innovation, intellectual property, and inequality in the age of artificial intelligence and big data. Featuring empirical studies, legal analyses, policy critiques, interdisciplinary perspectives, and global insights, Inclusive Innovation in the Age of AI and Big Data underscores the tremendous impact gender, race, and other socioeconomic factors have on innovation and intellectual property ecosystems. This volume also explores structural barriers in these ecosystems, diversity initiatives in the patent area, metrics for measuring inclusivity and diversity in innovation, changes brought about by artificial intelligence and big data, and the evolution of the global innovation and intellectual property systems. In an era marked by rapid technological change, extraordinary opportunities, and deepening inequality, this volume offers carefully designed reform strategies and policy recommendations to make innovation and intellectual property ecosystems more equitable, effective, and socially responsive…(More)”.

Inclusive Innovation in the Age of AI and Big Data

Book by Carissa Véliz: “For thousands of years, oracles, seers, and astrologers advised leaders and commoners alike about the future. But predictions are often power plays in disguise, obfuscating accountability and stripping individuals of their agency. Today we face the same threat of powerful prophets but under a new facade: tech.

Not only do modern predictions made by tech companies advise on war, industry, and marriages, but artificial intelligence also now determines whether we can get a loan, a job, an apartment, or an organ transplant. And when we cede ground to these predictions, we lose control of our own lives.

Drawing on history’s cautionary tales and modern-day tech companies’ malfeasance—from surveillance and biased algorithms to a startling lack of accountability—Carissa Véliz demonstrates that big tech’s prophecies are just as shallow, dangerous, and unjust as their ancient counterparts’. What she uncovers in the process is chilling. Artificial intelligence is increasing risk in business and society while creating a false sense of security. In this incisive, witty, and bracingly original book, Véliz contends that the main promise of prediction is not knowledge of the future but domination over others. Powerful people use predictions to determine our future. Prophecy is an invitation to defy those orders and live life on our own terms…(More)”.

Prophecy: Prediction, Power, and the Fight for the Future, from Ancient Oracles to AI

Book edited by Patrick Dunleavy and Timothy Monteath: “Open science is a set of principles and practices that aims to make research from all fields accessible to everyone for the benefit of researchers and society as a whole. Doing Open Social Science: A Guide for Researchers is the first comprehensive book setting out the principles and practices of open research, tailored specifically for those in the social science disciplines, at every career stage, offering practical advice on how to make research more transparent, trustworthy and reusable.

The book is divided into four parts. Part I explores the core principles and philosophy of open social science. Part II addresses how to improve the reproducibility of research through open approaches, including chapters on the principles and tools of documenting research as you go and on open data practices. Part III focuses on open practices within the qualitative social sciences. Chapters examine interview-based research, case studies and fieldwork, systematic documentation analysis, archival data and the role of openness in citizen (social) science. Part IV addresses shifting research cultures, with chapters on strategies for presenting research clearly and accessibly to maximise reach and impact and on open access publishing. The book ends with a discussion of the future of open social science. Ultimately, it argues, openness as a wider cultural change can renew the social sciences and the core foundations for academic progress in more dynamic and sustainable ways…(More)”.

Doing Open Social Science

Paper by Nicole Czaplicki et al.: “As part of the comprehensive Construction Re-engineering Initiative at the U.S. Census Bureau, alternative data sources are being considered to supplement or replace current data collection methods. For the Survey of Construction (SOC), which measures new residential construction, this includes observing housing starts from satellite imagery in place of the interviews for housing starts currently conducted by field representatives. Satellite images are obtained monthly for a subset of places in the SOC sample. Convolutional neural network models are then applied to the images to predict likely new residential construction projects, with the current focus being single-family housing starts. Several post-prediction processing steps are applied, including exclusions based on intersections with known buildings or roads, treatments for missing data due to cloud cover, and adjustments for the length of time between consecutive images, to ultimately produce place-level estimates of housing starts. These place-level estimates are then combined with the existing building-permit-level survey data to produce estimates of housing starts for the West South Central division, an experimental data product from the Census Bureau…(More)”.

A Blended Data Approach to Measuring Monthly Housing Starts: Satellite Imagery, Survey Data and More!
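The post-prediction pipeline the Census Bureau paper describes (exclusion filters, cloud-cover adjustment, image-interval normalization, and blending with survey data) can be sketched roughly as follows. This is a minimal illustrative sketch, not the Bureau's actual code; all names, thresholds, and the blending weight are hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One candidate construction site flagged by a CNN (illustrative)."""
    place_id: str
    confidence: float
    overlaps_known_building: bool
    overlaps_road: bool

def place_level_starts(detections, cloud_free_fraction, months_between_images,
                       threshold=0.5):
    """Estimate housing starts for one place from CNN detections.

    Mirrors the post-prediction steps described in the paper:
    exclusions for intersections with known buildings or roads,
    an adjustment for cloud-obscured area, and a rate adjustment
    for the interval between consecutive images.
    """
    kept = [d for d in detections
            if d.confidence >= threshold
            and not d.overlaps_known_building
            and not d.overlaps_road]
    raw = len(kept)
    # Scale up for area hidden by cloud cover (assumes starts are
    # spread uniformly over the place's visible and obscured area).
    adjusted = raw / max(cloud_free_fraction, 1e-6)
    # Normalize to a monthly rate when images are more than a month apart.
    return adjusted / max(months_between_images, 1)

def blended_estimate(imagery_estimate, survey_estimate, imagery_weight=0.5):
    """Toy linear composite of imagery- and survey-based estimates; the
    paper's actual blending with permit-level survey data is more elaborate."""
    return imagery_weight * imagery_estimate + (1 - imagery_weight) * survey_estimate
```

For example, three detections where one overlaps a known building, with 80% cloud-free imagery one month apart, yield an imagery-based estimate of 2.5 starts before blending.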

Report by Vinith Annam and Isaac Yoder: “This report is an advocacy tool for chief data officers (CDOs) looking to expand understanding of the CDO role and the conditions that contribute to its success. It identifies six archetypes of state-level CDO offices—clarifying common variations, the conditions that cause them, and possibilities that CDOs might aspire to.  

CDO offices have expanded across the country, yet no common model defines how they are structured, resourced, or positioned. Drawing on comparative survey data developed in partnership with the National Association of State Chief Information Officers (NASCIO) and self-reported data from the State Chief Data Officer (CDO) Tracker, this report examines how state CDO offices and equivalent state data offices are designed and operate across states, and the implications of those differences.

Key takeaways:

  • There is no “optimal” model. An office’s archetype reflects tradeoffs among competing institutional priorities. It is dynamic, not linear, and offices blend characteristics as goals and maturity evolve. 
  • Funding constraints are persistent across all maturity levels. Inadequate funding is a consistently cited challenge for the majority of states, indicating that resource pressures are structural.
  • Reporting structure shapes strategic orientation. State CDOs remain predominantly aligned with IT leadership. While this enables execution of technical data initiatives at scale, it can limit a CDO’s ability to shape data strategy, governance, and policy.
  • Some challenges and priorities evolve alongside data maturity, while others—particularly data quality and cross-agency data sharing—persist.
  • Building strategic relationships and trust is essential. Strong partnerships with top administration officials and IT leadership are foundational for successfully implementing enterprise-level data strategies.

This report addresses the analytical gap in how offices with similar aspirations function so differently in practice. In doing so, it offers a tool for data leaders looking to increase their office’s funding and authority through strategic conversations with decision-makers and data management stakeholders…(More)”.

State Chief Data Officer Archetypes: The Evolving Roles and Capabilities of CDO Offices

Article by Christopher Graziul and Cheryl M. Danton: “In March 2025, the European Union published the European Health Data Space (EHDS) regulation, creating a legal framework that will make the electronic health records of roughly 450 million residents available for secondary use by March 2029, including commercial product development, pharmaceutical research, and AI training (Regulation (EU) 2025/327, 2025). The system defaults to inclusion: citizens must opt out, and, currently, the opt-out is all-or-nothing, making no distinction between academic research and commercial pharmaceutical development. Seventeen leading scholars have warned that the framework risks enabling corporations to extract value from population health data without equitable benefit-sharing, producing a system where citizens bear both the data burden and the cost of products developed from it (Marelli et al., 2023). That is, the EHDS does not merely regulate existing sensitive open data. Rather, it creates a new category where governments convert private health records into commercially accessible information through legislative mandate.

This is the commodification of sensitive open data in real time. In a previous article, we addressed the governance challenge of sensitive open data: how to balance transparency and protection for personal data in public records like police radio transmissions and public health records (Danton & Graziul, 2026). This piece asks a different question: whose economic interests does inadequate governance serve? The answer, from Washington to Brussels to New Delhi, involves a global data brokerage industry that treats public records and government-collected personal data as raw material for commercial extraction (Grand View Research, n.d…(More)”.

The Commodification of Sensitive Open Data

Article by Mahvish Shaukat et al.: “Many governments and policymakers rely on policy-advising organisations – international development banks, think tanks, ministries – to translate academic research into actionable recommendations. Yet better evidence does not automatically produce better policy. Even when high-quality research exists, it must travel through layers of hierarchy inside a policy-advising organisation, both upward and downward. A junior analyst may surface a finding that never reaches the decision-maker who could act on it. Equally, a senior leader’s review of the evidence may never filter down to the operational level. Each step in the chain is a potential bottleneck, and these frictions in the evidence-to-policy pipeline increasingly impede the use of rigorous research in practice (DellaVigna et al. 2024, Garcia-Hombrados et al. 2025, Bonargent 2024, Rao 2024).

A growing evidence base examines how policymakers engage with evidence (Vivalt and Coville 2023, Toma and Bell 2024), and how training can build capacity for evidence use (Crowley et al. 2021, Mehmood et al. 2024), but much less is known about what drives evidence diffusion within organisations. Who shares evidence with whom? Does it depend on where in the hierarchy evidence first lands? Do concerns about how peers might react shape whether sharing happens? These are the questions we set out to answer…(More)”.

How does evidence diffuse through organisations?

Report by Reema Patel: “Our collective thinking about data governance is shaped by unconscious beliefs about the world, sometimes described as mental models. Our mental models shape which problems are noticed and which solutions to those problems seem feasible and possible, and they can limit our understanding of important issues such as how data can be governed and managed. Our current mental models about data are failing: the ongoing data trust deficit, public concern about data governance approaches, poor data quality, datasets with systemic bias and inequality that shape artificial intelligence, and repeated data governance system failures all point to the need to dramatically reshape the way we think about data governance…

This report maps out ten different mental models of data governance: data colonialism, data ownership, data control, data technocracy, data liberation, data protection, data justice, data sovereignty, data culture, and data stewardship. Understanding these mental models of data governance, I argue, is an essential first step towards moving beyond current approaches and realising a just and viable data governance future…(More)”.

Living well with data: stewardship as a just and viable paradigm
