Stefaan Verhulst
Book by Rahul Bhargava: “…new toolkit for data storytelling in community settings, one purpose-built for goals like inclusion, empowerment, and impact. Data science and visualization have spread into new domains they were not designed for – community organizing, education, journalism, civic governance, and more. The dominant computational methods and processes, which have not changed in response, are causing significant discriminatory and harmful impacts, documented by leading scholars across a variety of populations. Informed by 15 years of collaborations in academic and professional settings with nonprofits and marginalized populations, the book articulates a new approach for aligning the processes and media of data work with social good outcomes, learning from the practices of newspapers, museums, community groups, artists, and libraries.
This book introduces a community-driven framework as a response to the urgent need to realign data theories and methods around justice and empowerment to avoid further replicating harmful power dynamics and ensure everyone has a seat at the table in data-centered community processes. It offers a broader toolbox for working with data and presenting it, pushing beyond the limited vocabulary of surveys, spreadsheets, charts and graphs…(More)”.
Paper by Uri Y. Hacohen: “Data is often heralded as “the world’s most valuable resource,” yet its potential to benefit society remains unrealized due to systemic barriers in both public and private sectors. While open data (defined as data that is available, accessible, and usable) holds immense promise to advance open science, innovation, economic growth, and democratic values, its utilization is hindered by legal, technical, and organizational challenges. Public sector initiatives, such as U.S. and European Union open data regulations, face uneven enforcement and regulatory complexity, disproportionately affecting under-resourced stakeholders such as researchers. In the private sector, companies prioritize commercial interests and user privacy, often obstructing data openness through restrictive policies and technological barriers. This article proposes an innovative, four-layered policy framework to overcome these obstacles and foster data openness. The framework includes (1) improving open data infrastructures, (2) ensuring legal frameworks for open data, (3) incentivizing voluntary data sharing, and (4) imposing mandatory data sharing obligations. Each policy cluster is tailored to address sector-specific challenges and balance competing values such as privacy, property, and national security. Drawing from academic research and international case studies, the framework provides actionable solutions to transition from a siloed, proprietary data ecosystem to one that maximizes societal value. This comprehensive approach aims to reimagine data governance and unlock the transformative potential of open data…(More)”.
Article by Emily Badger and Sheera Frenkel: “The federal government knows your mother’s maiden name and your bank account number. The student debt you hold. Your disability status. The company that employs you and the wages you earn there. And that’s just a start. It may also know your …and at least 263 more categories of data. These intimate details about the personal lives of people who live in the United States are held in disconnected data systems across the federal government — some at the Treasury, some at the Social Security Administration and some at the Department of Education, among other agencies.
The Trump administration is now trying to connect the dots of that disparate information. Last month, President Trump signed an executive order calling for the “consolidation” of these segregated records, raising the prospect of creating a kind of data trove about Americans that the government has never had before, and that members of the president’s own party have historically opposed.
The effort is being driven by Elon Musk, the world’s richest man, and his lieutenants with the Department of Government Efficiency, who have sought access to dozens of databases as they have swept through agencies across the federal government. Along the way, they have elbowed past the objections of career staff, data security protocols, national security experts and legal privacy protections…(More)”.

Essay by Brian J. A. Boyd: “…How could stewardship of artificially living AI be pursued on a broader, even global, level? Here, the concept of “integral ecology” is helpful. Pope Francis uses the phrase to highlight the ways in which everything is connected, both through the web of life and because social, political, and environmental challenges cannot be solved in isolation. The immediate need for stewardship over AI is to ensure that its demands for power and industrial production are addressed in a way that benefits those most in need, rather than de-prioritizing them further. For example, the energy requirements to develop tomorrow’s AI should spur research into small modular nuclear reactors and updated distribution systems, making energy abundant rather than causing regressive harms by driving up prices on an already overtaxed grid. More broadly, we will need to find the right institutional arrangements and incentive structures to make AI Amistics possible.
We are having a painfully overdue conversation about the nature and purpose of social media, and tech whistleblowers like Tristan Harris have offered grave warnings about how the “race to the bottom of the brain stem” is underway in AI as well. The AI equivalent of the addictive “infinite scroll” design feature of social media will likely be engagement with simulated friends — but we need not resign ourselves to it becoming part of our lives as did social media. And as there are proposals to switch from privately held Big Data to a public Data Commons, so perhaps could there be space for AI that is governed not for maximizing profit but for being sustainable as a common-pool resource, with applications and protocols ordered toward long-run benefit as defined by local communities…(More)”.
Book by Andy J. Merolla and Jeffrey A. Hall: “We spend much of our waking lives communicating with others. How does each moment of interaction shape not only our relationships but also our worldviews? And how can we create moments of connection that improve our health and well-being, particularly in a world in which people are feeling increasingly isolated?
Drawing from their extensive research, Andy J. Merolla and Jeffrey A. Hall establish a new way to think about our relational life: as existing within “social biomes”—complex ecosystems of moments of interaction with others. Each interaction we have, no matter how unimportant or mundane it might seem, is a building block of our identities and beliefs. Consequently, the choices we make about how we interact and who we interact with—and whether we interact at all—matter more than we might know. Merolla and Hall offer a sympathetic, practical guide to our vital yet complicated social lives and propose realistic ways to embrace and enhance connection and hope…(More)”.
Article by UNDP: “Increasingly, AI techniques like natural language processing, machine learning and predictive analytics are being used alongside the most common methods in collective intelligence, from citizen science and crowdsourcing to digital democracy platforms.
At its best, AI can be used to augment and scale the intelligence of groups. In this section we describe the potential offered by these new combinations of human and machine intelligence. First we look at the applications that are most common, where AI is being used to enhance efficiency and categorize unstructured data, before turning to the emerging role of AI – where it helps us to better understand complex systems.
These are the three main ways AI and collective intelligence are currently being used together for the SDGs:
1. Efficiency and scale of data processing
AI is being effectively incorporated into collective intelligence projects where timing is paramount and a key insight is buried deep within large volumes of unstructured data. This combination of AI and collective intelligence is most useful when decision makers require an early warning to help them manage risks and distribute public resources more effectively. For example, Dataminr’s First Alert system uses pre-trained machine learning models to sift through text and images scraped from the internet, as well as other data streams, such as audio broadcasts, to isolate early signals that anticipate emergency events…(More)”. (See also: Where and when AI and CI meet: exploring the intersection of artificial and collective intelligence towards the goal of innovating how we govern).
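The core pattern described here, sifting a high-volume stream for a few posts that signal an emergency, can be sketched in a few lines. This is a toy illustration only, not Dataminr's actual system: First Alert uses pre-trained ML models, whereas here a simple keyword-weight score (with invented terms and weights) stands in for the classifier.

```python
# Toy early-warning filter over a stream of text posts.
# A weighted keyword score stands in for a trained classifier.
EMERGENCY_TERMS = {
    "explosion": 0.9, "flood": 0.8, "evacuate": 0.7,
    "fire": 0.6, "collapsed": 0.7, "injured": 0.5,
}

def signal_score(post: str) -> float:
    """Crude stand-in for a classifier: sum keyword weights, capped at 1.0."""
    words = post.lower().split()
    return min(1.0, sum(EMERGENCY_TERMS.get(w, 0.0) for w in words))

def early_alerts(stream, threshold=0.7):
    """Yield only the posts whose score crosses the alert threshold."""
    for post in stream:
        score = signal_score(post)
        if score >= threshold:
            yield post, score

posts = [
    "lovely weather in the park today",
    "explosion reported downtown, people injured",
    "river rising fast, residents told to evacuate",
]
alerts = list(early_alerts(posts))
```

A production system would replace `signal_score` with a model trained on labeled emergency reports and fuse multiple data streams (images, audio transcripts), but the filter-and-alert loop is the same shape.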
Article by Freedom House: “From Pakistan to Zambia, governments around the world are increasingly proposing and passing data localization legislation. These laws, which set the rules governing the storage and transfer of electronic data across jurisdictions, are often justified as addressing concerns such as user privacy, cybersecurity, national security, and monopolistic market practices. Notwithstanding these laudable goals, data localization initiatives cause more harm than good, especially in legal environments with poor rule of law.
Data localization requirements can take many different forms. A government may require all companies collecting and processing certain types of data about local users to store the data on servers located in the country. Authorities may also restrict the foreign transfer of certain types of data or allow it only under narrow circumstances, such as after obtaining the explicit consent of users, receiving a license or permit from a public authority, or conducting a privacy assessment of the country to which the data will be transferred.
While data localization can have significant economic and security implications, the focus of this piece—in line with that of the Global Network Initiative and Freedom House—is on its potential human rights impacts, which are varied. Freedom House’s research shows that the rise in data localization policies worldwide is contributing to the global decline of internet freedom. Without robust transparency and accountability frameworks embedded into these provisions, digital rights are often put on the line. As these types of legislation continue to pop up globally, the need for rights-respecting solutions and norms for cross-border data flows is greater than ever…(More)”.
Article by Mohamed Ibrahim: “Artificial intelligence (AI) is beginning to transform many industries, yet its use to improve public services remains limited globally. AI-based tools could streamline access to government benefits through online chatbots or automate systems by which citizens report problems such as potholes.
Currently, scholarly advances in AI are mostly confined to academic papers and conferences, rarely translating into actionable government policies or products. This means that the expertise at universities is not used to solve real-world problems. As a No10 Innovation Fellow with the UK government and a lecturer in spatial data science, I have explored the potential of AI-driven rapid prototyping in public policy.
Take Street.AI, a prototype smartphone app that I developed, which lets citizens report issues including potholes, street violence or illegal litter dumping by simply taking a picture through the app. The AI model classifies the problem automatically and alerts the relevant local authority, passing on the location and details of the issue. A key feature of the app is its on-device processing, which ensures privacy and reduces operational costs. Similar tools were tested as an early-warning system during the riots that swept the United Kingdom in July and August 2024.
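The classify-and-route step described here can be sketched as follows. This is a hypothetical illustration, not Street.AI's actual code: the on-device model, the issue labels, and the routing table to local-authority teams are all assumptions invented for the example.

```python
# Hypothetical sketch of a citizen-report pipeline: classify a photo
# on-device, then route the report to the responsible authority team.
from dataclasses import dataclass

def classify_image(image_bytes: bytes) -> str:
    # A real app would run a compact on-device vision model here
    # (keeping the photo on the phone preserves privacy and cuts cost).
    # This stub pretends the model recognized a pothole.
    return "pothole"

ROUTING = {  # issue label -> responsible local-authority team (assumed)
    "pothole": "highways",
    "illegal_dumping": "environmental_services",
    "street_violence": "police_liaison",
}

@dataclass
class Report:
    label: str
    department: str
    location: tuple  # (lat, lon)

def file_report(image_bytes: bytes, location: tuple) -> Report:
    label = classify_image(image_bytes)
    return Report(label, ROUTING.get(label, "general_enquiries"), location)

report = file_report(b"...jpeg bytes...", (51.5072, -0.1276))
```

The design point the article emphasizes, on-device processing, is what makes the `classify_image` step privacy-preserving: only the label and location leave the phone, not the image.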
AI models can also aid complex decision-making — for instance, that involved in determining where to build houses. The UK government plans to construct 1.5 million homes in the next 5 years, but planning laws require that several parameters be considered — such as proximity to schools, noise levels, the neighbourhoods’ built-up ratio and flood risk. The current strategy is to compile voluminous academic reports on viable locations, but an online dashboard powered by AI that can optimize across parameters would be much more useful to policymakers…(More)”.
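One simple way to "optimize across parameters" as described above is a weighted multi-criteria score over candidate sites. The sites, criteria values, and weights below are invented for illustration; real planning appraisals are far richer, and the weights would be set by policymakers, not hard-coded.

```python
# Toy multi-criteria scoring of candidate housing sites.
# Each criterion is normalized to [0, 1], where higher is better to build on.
SITES = {
    "site_a": {"school_access": 0.9, "quiet": 0.4, "open_land": 0.7, "flood_safe": 0.95},
    "site_b": {"school_access": 0.5, "quiet": 0.8, "open_land": 0.9, "flood_safe": 0.30},
    "site_c": {"school_access": 0.7, "quiet": 0.6, "open_land": 0.5, "flood_safe": 0.90},
}

# Policy weights (assumed): flood safety and school access matter most.
WEIGHTS = {"school_access": 0.3, "quiet": 0.2, "open_land": 0.2, "flood_safe": 0.3}

def score(site: dict) -> float:
    """Weighted sum of the site's criterion values."""
    return sum(WEIGHTS[k] * v for k, v in site.items())

ranked = sorted(SITES, key=lambda s: score(SITES[s]), reverse=True)
```

A dashboard of the kind proposed would let policymakers adjust `WEIGHTS` interactively and watch the ranking update, rather than reading the trade-offs out of a static report.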
Paper by Francesca Di Giuseppe, Joe McNorton, Anna Lombardi & Fredrik Wetterhall: “Recent advancements in machine learning (ML) have expanded the potential for its use across scientific applications, including weather and hazard forecasting. The ability of these methods to extract information from diverse and novel data types enables the transition from forecasting fire weather to predicting actual fire activity. In this study we demonstrate that this shift is also feasible within an operational context. Traditional fire forecasts tend to overpredict high fire danger, particularly in fuel-limited biomes, often resulting in false alarms. By using data on fuel characteristics, ignitions and observed fire activity, data-driven predictions reduce the false-alarm rate of high-danger forecasts, enhancing their accuracy. This is made possible by high-quality global datasets of fuel evolution and fire detection. We find that the quality of the input data matters more for improving forecasts than the complexity of the ML architecture. While the focus on ML advancements is often justified, our findings highlight the importance of investing in high-quality data and, where necessary, creating it through physical models. Neglecting this aspect would undermine the potential gains from ML-based approaches, emphasizing that data quality is essential to achieve meaningful progress in fire activity forecasting…(More)”.
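The false-alarm rate the paper aims to reduce has a simple definition: of all the days (or cells) flagged as high fire danger, the fraction where no fire was actually observed. The sketch below uses toy data invented for illustration; the study itself works with global fuel-evolution and fire-detection datasets.

```python
# False-alarm rate of high-danger forecasts: among flagged cases,
# the fraction where no fire was actually observed.
def false_alarm_rate(forecast_high: list, fire_observed: list) -> float:
    flagged = [obs for f, obs in zip(forecast_high, fire_observed) if f]
    if not flagged:
        return 0.0
    false_alarms = sum(1 for obs in flagged if not obs)
    return false_alarms / len(flagged)

observed   = [False, True, False, False, True, False]
# A fire-weather-only forecast flags danger often, including in
# fuel-limited conditions where fires cannot actually spread...
weather_fc = [True,  True, True,  True,  True, False]  # 5 flags, 3 false
# ...while a forecast informed by fuel availability flags fewer cases.
fuel_fc    = [False, True, False, False, True, False]  # 2 flags, 0 false
```

On this toy data the weather-only forecast has a false-alarm rate of 0.6 and the fuel-informed forecast 0.0, mirroring (in caricature) the improvement the paper reports from adding fuel data.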
Article by Nii Simmonds: “The digital era offers remarkable prospects for both economic advancement and social development. Yet for energy-constrained emerging economies, this potential often seems out of reach. The harsh truths of inconsistent electricity supply and scarce resources loom large over their digital ambitions. Nevertheless, a ray of hope shines through a strategy I call shared digital infrastructure (SDI). This cooperative model has the ability to turn these obstacles into opportunities for growth. By collaborating through regional country partnerships and bodies such as the Association of Southeast Asian Nations (ASEAN), the African Union (AU) and the Caribbean Community (CARICOM), these countries can harness the revolutionary power of digital technology, despite the challenges.
The digital economy is a critical driver of global GDP, with innovations in artificial intelligence, e-commerce and financial technology transforming industries at an unprecedented pace. At the heart of this transformation are data centres, which serve as the backbone of digital services, cloud computing and AI-driven applications. Yet many developing nations struggle to establish and maintain such facilities due to high energy costs, inadequate grid reliability and limited investment capital…(More)”.