Unlock Your City’s Hidden Solutions


Article by Andreas Pawelke, Basma Albanna and Damiano Cerrone: “Cities around the world face urgent challenges — from climate change impacts to rapid urbanization and infrastructure strain. Municipal leaders struggle with limited budgets, competing priorities, and pressure to show quick results, making traditional approaches to urban transformation increasingly difficult to implement.

Every city, however, has hidden success stories — neighborhoods, initiatives, or communities that are achieving remarkable results despite facing similar challenges as their peers.

These “positive deviants” often remain unrecognized and underutilized, yet they contain the seeds of solutions that are already adapted to local contexts and constraints.

Data-Powered Positive Deviance (DPPD) combines urban data, advanced analytics, and community engagement to systematically uncover these bright spots and amplify their impact. This new approach offers a pathway to urban transformation that is not only evidence-based but also cost-effective and deeply rooted in local realities.

DPPD is particularly valuable in resource-constrained environments, where expensive external solutions often fail to take hold. By starting with what’s already working, cities can make strategic investments that build on existing strengths rather than starting from scratch. Leveraging AI tools that improve community engagement, the approach becomes even more powerful — enabling cities to envision potential futures, and engage citizens in meaningful co-creation…(More)”
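For readers new to the method, the core analytical step of positive deviance can be sketched in a few lines: model the outcome you would expect given a unit's resources, then flag the units that sit far above that expectation. This is an illustrative sketch only, not the authors' implementation; all neighborhood names and figures below are hypothetical.

```python
# Sketch of positive-deviant detection: flag neighborhoods whose outcome
# far exceeds what their resources predict. All data here is hypothetical.
from statistics import mean, stdev

neighborhoods = [
    # (name, budget_per_capita, outcome_score)
    ("A", 120, 55), ("B", 150, 60), ("C", 110, 82),
    ("D", 200, 70), ("E", 130, 58), ("F", 90, 48),
]

xs = [b for _, b, _ in neighborhoods]
ys = [o for _, _, o in neighborhoods]

# Ordinary least-squares fit of outcome on budget: the "expected" performance line.
mx, my = mean(xs), mean(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Residual = actual minus expected; positive deviants sit well above the line.
residuals = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
cutoff = mean(residuals) + stdev(residuals)  # e.g. more than 1 SD above expectation
deviants = [n for (n, _, _), r in zip(neighborhoods, residuals) if r > cutoff]
print(deviants)  # the "bright spots" worth investigating qualitatively
```

In practice DPPD pairs this kind of quantitative screening with community engagement to understand *why* the flagged units outperform.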

The Next Wave of Innovation Districts


Article by Bruce Katz and Julie Wagner: “A next wave of innovation districts is gaining momentum given the structural changes underway in the global economy. The examples cited above telegraph where existing innovation districts are headed and explain why new districts are forming. The districts highlighted and many others are responding to fast-changing and highly volatile macro forces and the need to de-risk, decarbonize, and diversify talent.

The next wave of innovation districts is distinctive for multiple reasons.

  • The sectors leveraging this innovation geography expand way beyond the traditional focus on life sciences to include advanced manufacturing for military and civilian purposes.
  • The deeper emphasis on decarbonization is driving the use of basic and applied R&D to invent new clean technology products and solutions as well as organizing energy generation and distribution within the districts themselves to meet crucial carbon targets.
  • The stronger emphasis on the diversification of talent includes the upskilling of workers for new production activities and a broader set of systems to drive inclusive innovation to address long-standing inequities.
  • The districts are attracting a broader group of stakeholders, including manufacturing companies, utilities, university industrial design and engineering departments and hard tech startups.
  • The districts ultimately are looking to engage a wider base of investors given the disparate resources and traditions of capitalization that support defense tech, clean tech, med tech and other favored forms of innovation.

Some regions or states are also seeking ways to connect a constellation of districts and other economic hubs to harness the imperative to innovate accentuated by these and other macro forces. The state of South Australia is one such example. It has prioritized several innovation hubs across this region to foster South Australia’s knowledge and innovation ecosystem, as well as identify emerging economic clusters in industry sectors of global competitiveness to advance the broader economy…(More)”.

The EU’s AI Power Play: Between Deregulation and Innovation


Article by Raluca Csernatoni: “From the outset, the European Union (EU) has positioned itself as a trailblazer in AI governance with the world’s first comprehensive legal framework for AI systems in use, the AI Act. The EU’s approach to governing artificial intelligence (AI) has been characterized by a strong precautionary and ethics-driven philosophy. This ambitious regulation reflects the EU’s long-standing approach of prioritizing high ethical standards and fundamental rights in tech and digital policies—a strategy of fostering both excellence and trust in human-centric AI models. Yet, framed as essential to keep pace with U.S. and Chinese AI giants, the EU has recently taken a deregulatory turn that risks trading away democratic safeguards, without addressing systemic challenges to AI innovation.

The EU now stands at a crossroads: it can forge ahead with bold, home-grown AI innovation underpinned by robust regulation, or it can loosen its ethical guardrails, only to find itself stripped of both technological autonomy and regulatory sway. While Brussels’s recent deregulatory turn is framed as a much-needed competitiveness boost, the real obstacles to Europe’s digital renaissance lie elsewhere: persistent underfunding, siloed markets, and reliance on non-EU infrastructures…(More)”

Federated learning for children’s data


Article by Roy Saurabh: “Across the world, governments are prioritizing the protection of citizens’ data – especially that of children. New laws, dedicated data protection authorities, and digital infrastructure initiatives reflect a growing recognition that data is not just an asset, but a foundation for public trust. 

Yet a major challenge remains: how can governments use sensitive data to improve outcomes – such as in education – without undermining the very privacy protections they are committed to upholding?

One promising answer lies in federated, governance-aware approaches to data use. But realizing this potential requires more than new technology; it demands robust data governance frameworks designed from the outset.

Data governance: The missing link

In many countries, ministries of education, health, and social protection each hold pieces of the puzzle that together could provide a more complete picture of children’s learning and well-being. For example, a child’s school attendance, nutritional status, and family circumstances all shape their ability to thrive, yet these records are kept in separate systems.

Efforts to combine such data often run into legal and technical barriers. Centralized data lakes raise concerns about consent, security, and compliance with privacy laws. In fact, many international standards stress the principle of data minimization – the idea that personal information should not be gathered or combined unnecessarily. 


This is where the right data governance frameworks become essential. Effective governance defines clear rules about how data can be accessed, shared, and used – specifying who has the authority, what purposes are permitted, and how rights are protected. These frameworks make it possible to collaborate with data responsibly, especially when it comes to children…(More)”
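The federated idea the article points to can be illustrated simply: each ministry computes aggregates locally, and only those aggregates (never raw child-level records) leave its system. This is a minimal sketch of federated analytics under that data-minimization principle, not UNESCO's or any government's actual architecture; the ministries, records, and statistic are hypothetical.

```python
# Sketch of a federated computation: each ministry keeps raw child-level
# records local and shares only aggregates. All data here is hypothetical.

education = [0.8, 0.9, 0.7]   # e.g. attendance rates, held by the education ministry
health    = [0.6, 0.75]       # e.g. nutrition scores, held by the health ministry

def local_aggregate(records):
    # Only (sum, count) leaves the ministry -- never the underlying records.
    return sum(records), len(records)

# A coordinating server combines the aggregates into a cross-ministry statistic.
aggregates = [local_aggregate(education), local_aggregate(health)]
total = sum(s for s, _ in aggregates)
count = sum(n for _, n in aggregates)
print(round(total / count, 3))  # pooled mean, computed without pooling raw data
```

A governance framework then sits on top of this mechanism, specifying who may request such aggregates, for what purposes, and with what safeguards (e.g. minimum group sizes so no individual child is identifiable).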

How to Break Down Silos and Collaborate Across Government


Blog by Jessica MacLeod: “…To help public sector leaders navigate these cultural barriers, I use a simple but powerful framework: Clarity, Care, and Challenge. It’s built from research, experience, and what I’ve seen actually shift how teams work. You can read more about the framework in my previous article on high-performing teams. Here’s how this framework relates to breaking down silos:

  • Clarity → How We Work:
    Clear priorities, aligned expectations, and a shared understanding of how individual work connects to the bigger picture.
  • Care → How We Relate:
    Trust, psychological safety, and strong collaboration.
  • Challenge → How We Achieve:
    Stretch goals, high standards, and a culture that encourages innovation and growth.

Silos thrive in ambiguity. If no one can see the work, understand the language, or map who owns what, collaboration dies on arrival.

When I work with public sector teams, one of the first things I look for is how visible the work is. Can people across departments explain where things stand on a project today? Or what the context is behind a project? Do they know who’s accountable? Can they locate the latest draft of the work without digging through three email chains?

Often, the answer is no, and it’s not because people aren’t trying. It’s because our systems are optimized for siloed visibility, not shared clarity.

Here’s what that looks like in practice:

  • A particular acronym means one thing to IT, another to leadership, and something entirely different to community stakeholders.
  • “Launch” for one team means public announcement. For another, it means testing a feature with a pilot group.
  • Documents live in private folders, on individual desktops, or in tools that don’t talk to each other…(More)”.

The Importance of Co-Designing Questions: 10 Lessons from Inquiry-Driven Grantmaking


Article by Hannah Chafetz and Stefaan Verhulst: “How can a question-based approach to philanthropy enable better learning and deeper evaluation across both sides of the partnership and help make progress towards long-term systemic change? That’s what Siegel Family Endowment (Siegel), a family foundation based in New York City, sought to answer by creating an Inquiry-Driven Grantmaking approach.

While many philanthropies continue to follow traditional practices that focus on achieving a set of strategic objectives, Siegel employs an inquiry-driven approach, which focuses on answering questions that can accelerate insights and iteration across the systems they seek to change. By framing their goal as “learning” rather than an “outcome” or “metric,” they aim to generate knowledge that can be shared across the whole field and unlock impact beyond the work on individual grants. 

The Siegel approach centers on co-designing and iteratively refining questions with grantees to address evolving strategic priorities, using rapid iteration and stakeholder engagement to generate insights that inform both grantee efforts and the foundation’s decision-making.

Their approach was piloted in 2020, then refined and operationalized in the years that followed. As of 2024, it was applied across the vast majority of their grantmaking portfolio. Laura Maher, Chief of Staff and Director of External Engagement at Siegel Family Endowment, notes: “Before our Inquiry-Driven Grantmaking approach we spent roughly 90% of our time on the grant writing process and 10% checking in with grantees, and now that’s balancing out more.”


Image of the Inquiry-Driven Grantmaking Process from the Siegel Family Endowment

Earlier this year, the DATA4Philanthropy team conducted two in-depth discussions with Siegel’s Knowledge and Impact team to discuss their Inquiry-Driven Grantmaking approach and what they learned thus far from applying their new methodology. While the Siegel team notes that there is still much to be learned, there are several takeaways that can be applied to others looking to initiate a questions-led approach. 

Below we provide 10 emerging lessons from these discussions…(More)”.

Glorious RAGs: A Safer Path to Using AI in the Social Sector


Blog by Jim Fruchterman: “Social sector leaders ask me all the time for advice on using AI. As someone who started for-profit machine learning (AI) companies in the 1980s, but then pivoted to running nonprofit social enterprises, I’m often the first person from Silicon Valley that many nonprofit leaders have met. I joke that my role is often that of “anti-consultant,” talking leaders out of doing an app, a blockchain (smile) or firing half their staff because of AI. Recently, much of my role has been tamping down the excessive expectations being bandied about for the impact of AI on organizations. However, two years into the latest AI fad wave created by ChatGPT and its LLM (large language model) peers, more and more of the leaders are describing eminently sensible applications of LLMs to their programs. The most frequent of these approaches can be described as variations on “Retrieval-Augmented Generation,” also known as RAG. I am quite enthusiastic about using RAG for social impact, because it addresses a real need and supplies guardrails for using LLMs effectively…(More)”
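For readers unfamiliar with the pattern, the RAG loop Fruchterman describes can be sketched in miniature: retrieve the passages from an organization's own documents that best match a question, then constrain the model to answer only from them. The documents and word-overlap scoring below are toy stand-ins (a real system would use vector embeddings and an actual LLM call), and none of it comes from the blog itself.

```python
# Minimal RAG sketch: retrieve relevant passages, then ground the model's
# answer in them. Documents and scoring are hypothetical toy stand-ins.

documents = [
    "Our food bank is open Tuesdays and Thursdays from 9am to 1pm.",
    "Volunteers must complete a safety orientation before their first shift.",
    "Donations of canned goods are accepted at the main office.",
]

def retrieve(query, docs, k=1):
    # Score by word overlap with the query -- a toy proxy for vector similarity.
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query, docs):
    # The retrieved text acts as the guardrail: the model is instructed to
    # answer only from it, not from its open-ended training data.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("When is the food bank open?", documents)
print(prompt)
```

The guardrail Fruchterman values lives in the prompt construction: because the answer is anchored to retrieved organizational text, hallucination risk drops and answers stay auditable against the source documents.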

Data Commons: The Missing Infrastructure for Public Interest Artificial Intelligence


Article by Stefaan Verhulst, Burton Davis and Andrew Schroeder: “Artificial intelligence is celebrated as the defining technology of our time. From ChatGPT to Copilot and beyond, generative AI systems are reshaping how we work, learn, and govern. But behind the headline-grabbing breakthroughs lies a fundamental problem: The data these systems depend on to produce useful results that serve the public interest is increasingly out of reach.

Without access to diverse, high-quality datasets, AI models risk reinforcing bias, deepening inequality, and returning less accurate, more imprecise results. Yet, access to data remains fragmented, siloed, and increasingly enclosed. What was once open—government records, scientific research, public media—is now locked away by proprietary terms, outdated policies, or simple neglect. We are entering a data winter just as AI’s influence over public life is heating up.

This isn’t just a technical glitch. It’s a structural failure. What we urgently need is new infrastructure: data commons.

A data commons is a shared pool of data resources—responsibly governed, managed using participatory approaches, and made available for reuse in the public interest. Done correctly, commons can ensure that communities and other networks have a say in how their data is used, that public interest organizations can access the data they need, and that the benefits of AI can be applied to meet societal challenges.

Commons offer a practical response to the paradox of data scarcity amid abundance. By pooling datasets across organizations—governments, universities, libraries, and more—they match data supply with real-world demand, making it easier to build AI that responds to public needs.

We’re already seeing early signs of what this future might look like. Projects like Common Corpus, MLCommons, and Harvard’s Institutional Data Initiative show how diverse institutions can collaborate to make data both accessible and accountable. These initiatives emphasize open standards, participatory governance, and responsible reuse. They challenge the idea that data must be either locked up or left unprotected, offering a third way rooted in shared value and public purpose.

But the pace of progress isn’t matching the urgency of the moment. While policymakers debate AI regulation, they often ignore the infrastructure that makes public interest applications possible in the first place. Without better access to high-quality, responsibly governed data, AI for the common good will remain more aspiration than reality.

That’s why we’re launching The New Commons Challenge—a call to action for universities, libraries, civil society, and technologists to build data ecosystems that fuel public-interest AI…(More)”.

Entering the Vortex


Essay by Nils Gilman: “A strange and unsettling weather pattern is forming over the landscape of scholarly research. For decades, the climate of academic inquiry was shaped by a prevailing high-pressure system, a consensus grounded in the vision articulated by Vannevar Bush in “Science: The Endless Frontier” (1945). That era was characterized by robust federal investment, a faith in the university as the engine of basic research, and a compact that traded public funding for scientific autonomy and the promise of long-term societal benefit. It was a climate conducive to the slow, deliberate, and often unpredictable growth of knowledge, nurtured by a diverse ecosystem of human researchers — the vital “seed stock” of intellectual discovery.

But that high-pressure system is collapsing. A brutal, unyielding cold front of academic defunding has swept across the nation, a consequence of shifting political priorities, populist resentment, and a calculated assault on the university as an institution perceived as hostile to certain political agendas. This is not merely a belt-tightening exercise; it is, for all intents and purposes, the dismantling of Vannevar Bush’s Compact, the end of the era of “big government”-funded Wissenschaft. Funding streams for basic research are dwindling, grant applications face increasingly long odds, and the financial precarity of academic careers deters the brightest minds. The human capital necessary for sustained, fundamental inquiry is beginning to wither.

Simultaneously, a warm, moisture-laden airmass is rapidly advancing: the astonishing rise of AI-based research tools. Powered by vast datasets and sophisticated algorithms, these tools promise to revolutionize every stage of the research process – from literature review and data analysis to hypothesis generation and the drafting of scholarly texts. As a recent New Yorker piece on AI and the humanities suggests, these AI engines can already generate deep research and coherent texts on virtually any subject, seemingly within moments. They offer the prospect of unprecedented efficiency, speed, and scale in the production of scholarly output.

The collision of these two epochal weather systems — the brutal cold front of academic defunding and the warm, expansive airmass of AI-based research tools — is creating an atmospheric instability unlike anything the world of scholarship has ever witnessed. Along the front where these forces meet, a series of powerful and unpredictable tornados are beginning to touch down, reshaping the terrain of knowledge production in real-time…(More)”.

Real-time prices, real results: comparing crowdsourcing, AI, and traditional data collection


Article by Julius Adewopo, Bo Andree, Zacharey Carmichael, Steve Penson, Kamwoo Lee: “Timely, high-quality food price data is essential for shock-responsive decision-making. However, in many low- and middle-income countries, such data is often delayed, limited in geographic coverage, or unavailable due to operational constraints. Traditional price monitoring, which relies on structured surveys conducted by trained enumerators, is often constrained by challenges related to cost, frequency, and reach.

To help overcome these limitations, the World Bank launched the Real-Time Prices (RTP) data platform. This effort provides monthly price data using a machine learning framework. The models combine survey results with predictions derived from observations in nearby markets and related commodities. This approach helps fill gaps in local price data across a basket of goods, enabling real-time monitoring of inflation dynamics even when survey data is incomplete or irregular.

In parallel, new approaches—such as citizen-submitted (crowdsourced) data—are being explored to complement conventional data collection methods. These crowdsourced data were recently published in a Nature Scientific Data paper. While the adoption of these innovations is accelerating, maintaining trust requires rigorous validation.

A newly published study in PLOS compares these two emerging methods with the traditional, enumerator-led gold standard, providing new evidence that both crowdsourced and AI-imputed prices can serve as credible, timely alternatives to traditional ground-truth data collection—especially in contexts where conventional methods face limitations…(More)”.