Stefaan Verhulst
Report by TIAL: “Today more than ever, legitimacy is a vital resource for institutions seeking to lead and sustain impactful change. Yet, it can be elusive.
What does it truly mean for an institution to be legitimate? This publication delves into legitimacy as both a practical asset and a dynamic process, offering institutional entrepreneurs the tools to understand, build, and sustain it over time.
Legitimacy is not a static quality, nor is it purely theoretical. Instead, it’s grounded in the beliefs of those who interact with or are governed by an institution. These beliefs shape whether people view an institution’s authority as rightful and worth supporting. Drawing from social science research and real-world insights, this publication provides a framework to help institutional entrepreneurs address one of the most important challenges of institutional design: ensuring their legitimacy is sufficient to achieve their goals.
The paper emphasizes that legitimacy is relational and contextual. Institutions gain it through three primary sources: outcomes (delivering results), fairness (ensuring just processes), and correct procedures (following accepted norms). However, the need for legitimacy varies depending on the institution’s size, scope, and mission. For example, a body requiring elite approval may need less legitimacy than one relying on mass public trust.
Legitimacy is also dynamic—it ebbs and flows in response to external factors like competition, crises, and shifting societal narratives. Institutional entrepreneurs must anticipate these changes and actively manage their strategies for maintaining legitimacy. This publication highlights actionable steps for doing so, from framing mandates strategically to fostering public trust through transparency and communication.
By treating legitimacy as a resource that evolves over time, institutional entrepreneurs can ensure their institutions remain relevant, trusted, and effective in addressing pressing societal challenges.
Key takeaways
- Legitimacy is the belief by an audience that an institution’s authority is rightful.
- Institutions build legitimacy through outcomes, fairness, and correct procedures.
- The need for legitimacy depends on an institution’s scope and mission.
- Legitimacy is dynamic and shaped by external factors like crises and competition.
- A portfolio approach to legitimacy—balancing outcomes, fairness, and procedure—is more resilient.
- Institutional entrepreneurs must actively manage perceptions and adapt to changing contexts.
- This publication offers practical frameworks to help institutional entrepreneurs build and sustain legitimacy…(More)”.
Article by Hao Cui and Taha Yasseri: “Imagine a large city recovering from a devastating hurricane. Roads are flooded, the power is down, and local authorities are overwhelmed. Emergency responders are doing their best, but the chaos is massive.
AI-controlled drones survey the damage from above, while intelligent systems process satellite images and data from sensors on the ground and in the air to identify which neighbourhoods are most vulnerable.
Meanwhile, AI-equipped robots are deployed to deliver food, water and medical supplies into areas that human responders can’t reach. Emergency teams, guided and coordinated by AI and the insights it produces, are able to prioritise their efforts, sending rescue squads where they’re needed most.
This is no longer the realm of science fiction. In a recent paper published in the journal Patterns, we argue that it’s an emerging and inevitable reality.
Collective intelligence is the shared intelligence of a group or groups of people working together. Different groups of people with diverse skills, such as firefighters and drone operators, work together to generate better ideas and solutions. AI can enhance this human collective intelligence and transform how we approach large-scale crises. It’s a form of what’s called hybrid collective intelligence.
Instead of simply relying on human intuition or traditional tools, experts can use AI to process vast amounts of data, identify patterns and make predictions. By enhancing human decision-making, AI systems offer faster and more accurate insights – whether in medical research, disaster response, or environmental protection.
AI can do this by, for example, processing large datasets and uncovering insights that would take humans much longer to identify. AI can also get involved in physical tasks. In manufacturing, AI-powered robots can automate assembly lines, helping improve efficiency and reduce downtime.
Equally crucial is information exchange, where AI enhances the flow of information, helping human teams coordinate more effectively and make data-driven decisions faster. Finally, AI can act as a social catalyst, facilitating more effective collaboration within human teams or even helping build hybrid teams of humans and machines working alongside one another…(More)”.
Article by Lizzi C. Lee: “Chinese firms generate staggering amounts of data daily, from ride-hailing trips to online shopping transactions. A recent policy allowed Chinese companies to record data as assets on their balance sheets, the first such regulation in the world, paving the way for data to be traded in a marketplace and boost company valuations.
But uptake has been slow. When China Unicom, one of the world’s largest mobile operators, reported its earnings recently, eagle-eyed accountants spotted that the company had listed 204 million yuan ($28 million) in data assets on its balance sheet. The state-owned operator was the first Chinese tech giant to take advantage of the Ministry of Finance’s new corporate data policy, which permits companies to classify data as inventory or intangible assets.
“No other country is trying to do this on a national level. It could drive global standards of data management and accounting,” Ran Guo, an affiliated researcher at the Asia Society Policy Institute specializing in data governance in China, told Rest of World.
In 2023 alone, China generated 32.85 zettabytes — more than 27% of the global total, according to a government survey. To put that in perspective, storing this volume on standard 1-terabyte hard drives would require more than 32 billion units…Data-rich tech companies are well positioned to benefit from logging data as assets and turning those formalized assets into tradable commodities, said Guo. But companies must first invest in secure storage and show that the data was legally obtained in order to meet strict government rules on data security.
“This can be costly and complex,” he said. “Not all data qualifies as an asset, and companies must meet stringent requirements.”
Even China Unicom, a state-owned enterprise, is likely complying with the new policy due to political pressure rather than economic incentive, said Guo, who conducted field research in China last year on the government push for data resource development. The telecom operator did not respond to a request for comment.
Private technology companies in China, meanwhile, tend to be protective of their data. A Chinese government statement in 2022 pushed private enterprises to “open up their data.” But smaller firms could lack the resources to meet the stringent data storage and consumer protection standards, experts and Chinese tech company employees told Rest of World...(More)”.
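As an editorial aside, here is a quick back-of-the-envelope check of the storage figure in the excerpt above. This is a rough sketch assuming decimal (SI) units, where 1 zettabyte equals 10^9 terabytes; it is not a calculation from the original article.

```python
# Rough check: how many 1-terabyte drives would 32.85 zettabytes fill?
# Assumes decimal (SI) units: 1 ZB = 1e21 bytes, 1 TB = 1e12 bytes, so 1 ZB = 1e9 TB.
zettabytes_generated = 32.85
terabytes_per_zettabyte = 1e9
drives_needed = zettabytes_generated * terabytes_per_zettabyte
print(f"{drives_needed:,.0f} one-terabyte drives")  # prints 32,850,000,000, i.e. more than 32 billion
```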
Essay by the Transition Collective: “Government organizations and their leaders are in a pinch. They are caught between pressures from politicians, citizens and increasingly complex external environments on the one hand — and from civil servants calling for new ways of working, thriving and belonging on the other hand. They have to enable meaningful, joined-up and efficient services for people, leveraging digital and physical resources, while building an attractive organizational culture. Indeed, the challenge is to build systems as human as the people they are intended to serve.
While this creates massive challenges for public sector organizations, this is also an opportunity to reimagine our institutions to meet the challenges of today and the future. To succeed, we must not only think about other models of organization — we also have to think of other ways of changing them.
Traditionally, we think of the organization as something static, a goal we arrive at or a fixed model we decide upon. If asked to describe their organization, most civil servants will point to an organigram — and more often than not it will consist of a number of boxes and lines, ordered in a hierarchy.
But in today’s world of complex challenges, accelerated frequency of change and dynamic interplay between the public sector and its surroundings, such a fixed model is less and less fit for the purposes it must fulfill. Not only does it not allow the collective intelligence and creativity of the organization’s members to be fully unleashed, it also does not allow for the speed and adaptability required by today’s turbulent environment. It does not allow for truly joined-up, meaningful human services.
Unfreezing the organization
Rather than thinking mainly about models and forms, we should think of organizational design as an act or a series of actions. In other words, we should think about the organization not just as a what but also as a how: less as a set of boxes describing a power hierarchy, and more as a set of living, organic roles and relationships. We need to thaw our organizations out of their frozen state and keep them warmer and more fluid.
In this piece, we suggest that many efforts to reimagine public sector organizations have failed because the challenge of transforming an organization has been underestimated. We draw on concrete experiences from working with international and Danish public sector institutions, in particular in health and welfare services.
We propose a set of four approaches which, taken together, can support the work of redesigning organizations to be more ambitious, free, human, creative and self-managing — and thus better suited to meet the ever more complex challenges they are faced with…(More)”.
Blog by dynomight: “Because everyone uses Bayesian reasoning all the time, even if they don’t think of it that way. Arguably, we’re born Bayesian and do it instinctively. It’s normal and natural and—I daresay—almost boring. “Bayesian reasoning” is just a slight formalization of everyday thought.
It’s not a trend. It’s forever. But it’s forever like arithmetic is forever: Strange to be obsessed with it, but really strange to make fun of someone for using it.
Here, I’ll explain what Bayesian reasoning is, why it’s so fundamental, why people argue about it, and why much of that controversy is ultimately a boring semantic debate of no interest to an enlightened person like yourself. Then, for the haters, I’ll give some actually good reasons to be skeptical about how useful it is in practice.
I won’t use any equations. That’s not because I don’t think you can take it, but because Bayesian reasoning isn’t math. It’s a concept. The typical explanations use lots of math and kind of gesture around the concept, but never seem to get to the core of it, which I think leads people to miss the forest for the trees…(More)”.
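As an illustrative aside (the post itself deliberately avoids equations), here is a minimal sketch of the kind of updating the excerpt describes. The scenario and numbers are invented for illustration and are not taken from the post.

```python
# A minimal Bayesian update with two competing hypotheses and one observation.
# Question: is it raining outside? Evidence: many people are carrying umbrellas.
prior = {"rain": 0.3, "no_rain": 0.7}        # belief before seeing any evidence
likelihood = {"rain": 0.9, "no_rain": 0.2}   # P(umbrellas | each hypothesis)

# Bayes' rule: posterior is proportional to prior times likelihood, then normalize.
unnormalized = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: weight / total for h, weight in unnormalized.items()}

print(posterior)  # roughly {'rain': 0.66, 'no_rain': 0.34}: the evidence shifts belief toward rain
```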
Article by Jessica Hilburn: “This land was made for you and me, and so was the data collected with our taxpayer dollars. Open data is data that is accessible, shareable, and able to be used by anyone. While any person, company, or organization can create and publish open data, the federal and state governments are by far the largest providers of open data.
President Barack Obama codified the importance of government-created open data in his May 9, 2013, executive order as a part of the Open Government Initiative. This initiative was meant to “ensure the public trust and establish a system of transparency, public participation, and collaboration” in furtherance of strengthening democracy and increasing efficiency. The initiative also launched Project Open Data (since replaced by the Resources.data.gov platform), which documented best practices and offered tools so government agencies in every sector could open their data and contribute to the collective public good. As has been made readily apparent, the era of public good through open data is now under attack.
Immediately after his inauguration, President Donald Trump signed a slew of executive orders, many of which targeted diversity, equity, and inclusion (DEI) for removal in federal government operations. Unsurprisingly, a large number of federal datasets include information dealing with diverse populations, equitable services, and inclusion of marginalized groups. Other datasets deal with information on topics targeted by those with nefarious agendas—vaccination rates, HIV/AIDS, and global warming, just to name a few. In the wake of these executive orders, datasets and website pages with blacklisted topics, tags, or keywords suddenly disappeared—more than 8,000 of them. In addition, President Trump fired the National Archivist, and top National Archives and Records Administration officials are being ousted, putting the future of our collective history at enormous risk.
While it is common practice to archive websites and information in the transition between administrations, it is unprecedented for the incoming administration to cull data altogether. In response, unaffiliated organizations are ramping up efforts to separately archive data and information for future preservation and access. Web scrapers are being used to grab as much data as possible, but since this method is automated, data behind a login or a bot challenge (like a CAPTCHA) is left behind. The future information gap that researchers will be left to grapple with could be catastrophic for progress in crucial areas, including weather, natural disasters, and public health. Though there are efforts to put out the fire, such as the federal order to restore certain resources, the people’s library is burning. The losses will be permanently felt…Data is a weapon, whether we like it or not. Free and open access to information—about democracy, history, our communities, and even ourselves—is the foundation of library service. It is time for anyone who continues to claim that libraries are not political to wake up before it is too late. Are libraries still not political when the Pentagon barred library access for tens of thousands of American children attending Pentagon schools on military bases while they examined and removed supposed “radical indoctrination” books? Are libraries still not political when more than 1,000 unique titles are being targeted for censorship annually, and soft censorship through preemptive restriction to avoid controversy is surely occurring and impossible to track? It is time for librarians and library workers to embrace being political.
In a country where the federal government now denies that certain people even exist, claims that children are being indoctrinated because they are being taught the good and bad of our nation’s history, and rescinds support for the arts, humanities, museums, and libraries, there is no such thing as neutrality. When compassion and inclusion are labeled the enemy and the diversity created by our great American experiment is lambasted as a social ill, claiming that libraries are neutral or apolitical is not only incorrect, it’s complicit. To update the quote, information is the weapon in the war of ideas. Librarians are the stewards of information. We don’t want to be the Americans who protested in 1933 at the first Nazi book burnings and then, despite seeing the early warning signs of catastrophe, retreated into the isolation of their own concerns. The people’s library is on fire. We must react before all that is left of our profession is ash…(More)”.
Paper by Devansh Saxena, et al: “Local and federal agencies are rapidly adopting AI systems to augment or automate critical decisions, efficiently use resources, and improve public service delivery. AI systems are being used to support tasks associated with urban planning, security, surveillance, energy and critical infrastructure, and support decisions that directly affect citizens and their ability to access essential services. Local governments act as the governance tier closest to citizens and must play a critical role in upholding democratic values and building community trust especially as it relates to smart city initiatives that seek to transform public services through the adoption of AI. Community-centered and participatory approaches have been central for ensuring the appropriate adoption of technology; however, AI innovation introduces new challenges in this context because participatory AI design methods require more robust formulation and face higher standards for implementation in the public sector compared to the private sector. This requires us to reassess traditional methods used in this space as well as develop new resources and methods. This workshop will explore emerging practices in participatory algorithm design – or the use of public participation and community engagement – in the scoping, design, adoption, and implementation of public sector algorithms…(More)”.
Paper by Pietro Gennari: “Over the last few years, the private sector has become a primary generator of data due to widespread digitisation of the economy and society, the use of social media platforms, and advancements of technologies like the Internet of Things and AI. Unlike traditional sources, these new data streams often offer real-time information and unique insights into people’s behaviour, social dynamics, and economic trends. However, the proprietary nature of most private sector data presents challenges for public access, transparency, and governance that have led to fragmented, often conflicting, data governance arrangements worldwide. This lack of coherence can exacerbate inequalities, limit data access, and restrict data’s utility as a global asset.
Within this context, data equity has emerged as one of the key principles underpinning any proposal for a new data governance framework. The term “data equity” refers to the fair and inclusive access, use, and distribution of data so that it benefits all sections of society, regardless of socioeconomic status, race, or geographic location. It involves making sure that the collection, processing, and use of data do not disproportionately benefit or harm any particular group, and it seeks to address disparities in data access and quality that can perpetuate social and economic inequalities. This is important because data systems significantly influence access to resources and opportunities in society. In this sense, data equity aims to correct imbalances that have historically affected various groups and to ensure that decision-making based on data does not perpetuate these inequities…(More)”
Toolkit by Maria Claudia Bodino, Nathan da Silva Carvalho, Marcelo Cogo, Arianna Dafne Fini Storchi, and Stefaan Verhulst: “Despite the abundance of data, the excitement around AI, and the potential for transformative insights, many public administrations struggle to translate data into actionable strategies and innovations.
Public servants working with data-related initiatives need practical, easy-to-use resources designed to enhance the management of data innovation initiatives.
In order to address these needs, the iLab of DG DIGIT from the European Commission is developing an initial set of practical tools designed to facilitate and enhance the implementation of data-driven initiatives. The main building blocks of the first version of the Digital Innovation Toolkit include:
- A repository of educational materials and resources on the latest data innovation approaches from the public sector, academia, NGOs, and think tanks
- An initial set of practical resources, for example:
  - Workshop Templates to offer structured formats for conducting productive workshops that foster collaboration, ideation, and problem-solving.
  - Checklists to ensure that all data journey aspects and steps are properly assessed.
  - Interactive Exercises to engage team members in hands-on activities that build skills and facilitate understanding of key concepts and methodologies.
  - Canvas Models to provide visual frameworks for planning and brainstorming…(More)”.

Paper by Sara Marcucci, Stefaan Verhulst and María Esther Cervantes: “The various global refugee and migration events of the last few years underscore the need for advancing anticipatory strategies in migration policy. The struggle to manage large inflows (or outflows) highlights the demand for proactive measures based on a sense of the future. Anticipatory methods, ranging from predictive models to foresight techniques, emerge as valuable tools for policymakers. These methods, now bolstered by advancements in technology and leveraging nontraditional data sources, can offer a pathway to develop more precise, responsive, and forward-thinking policies.
This paper seeks to map out the rapidly evolving domain of anticipatory methods in the realm of migration policy, capturing the trend toward integrating quantitative and qualitative methodologies and harnessing novel tools and data. It introduces a new taxonomy designed to organize these methods into three core categories: Experience-based, Exploration-based, and Expertise-based. This classification aims to guide policymakers in selecting the most suitable methods for specific contexts or questions, thereby enhancing migration policies…(More)”
