Stefaan Verhulst
Paper by Giliberto Capano, et al: “Theories of the policy process understand the dynamics of policymaking as the result of the interaction of structural and agency variables. While these theories tend to conceptualize structural variables in a careful manner, agency (i.e. the actions of individual agents, like policy entrepreneurs, policy leaders, policy brokers, and policy experts) is left as a residual piece in the puzzle of the causality of change and stability. This treatment of agency leaves room for conceptual overlaps, analytical confusion and empirical shortcomings that can complicate the life of the empirical researcher and, most importantly, hinder the ability of theories of the policy process to fully address the drivers of variation in policy dynamics. Drawing on Merton’s concept of function, this article presents a novel theorization of agency in the policy process. We start from the assumption that agency functions are a necessary component through which policy dynamics evolve. We then theorise that agency can fulfil four main functions – steering, innovation, intermediation and intelligence – that need to be performed, by individual agents, in any policy process through four patterns of action – leadership, entrepreneurship, brokerage and knowledge accumulation – and we provide a roadmap for operationalising and measuring these concepts. We then demonstrate what can be achieved in terms of analytical clarity and potential theoretical leverage by applying this novel conceptualisation to two major policy process theories: the Multiple Streams Framework (MSF) and the Advocacy Coalition Framework (ACF)…(More)”.
Book by Alejandra Soriano Diaz: “Information is not only a fundamental human right; it has also been shaped as a pillar for the exercise of other human rights around the world. It is the path for holding authorities and other powerful actors accountable to the people, who are, for all purposes, the actual owners of public data.
Providing information about public decisions that have the potential to significantly impact a community is vital to modern democracy. This book explores the forms in which individuals and collectives are able to voice their opinions and participate in public decision-making when long-lasting effects on present and future generations are at stake. The strong correlation between the right to access public information and the enjoyment of civil and political rights, as well as economic and environmental rights, emphasizes their interdependence.
This study raises a number of important questions to mobilize efforts towards openness and to empower people’s ownership of their public information…(More)”.
Book edited by Kostina Prifti, Esra Demir, Julia Krämer, Klaus Heine, and Evert Stamhuis: “This book explores the structure and frameworks of digital governance, focusing on various regulatory patterns, with the aim of tackling the disruptive impact of artificial intelligence (AI) technologies. Addressing the various challenges posed by AI technologies, this book explores potential avenues for crafting legal remedies and solutions, spanning liability of AI, platform governance, and the implications for data protection and privacy…(More)”.
Paper by Takahiro Yabe et al: “Disruptions, such as closures of businesses during pandemics, not only affect businesses and amenities directly but also influence how people move, spreading the impact to other businesses and increasing the overall economic shock. However, it is unclear how much businesses depend on each other during disruptions. Leveraging human mobility data and same-day visits in five US cities, we quantify dependencies between points of interest encompassing businesses, stores and amenities. We find that dependency networks computed from human mobility exhibit significantly higher rates of long-distance connections and biases towards specific pairs of point-of-interest categories. We show that using behaviour-based dependency relationships improves the predictability of business resilience during shocks by around 40% compared with distance-based models, and that neglecting behaviour-based dependencies can lead to underestimation of the spatial cascades of disruptions. Our findings underscore the importance of measuring complex relationships in patterns of human mobility to foster urban economic resilience to shocks…(More)”.
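The paper’s core measure is easy to illustrate: two points of interest become behaviourally linked when the same person visits both on the same day. Below is a minimal sketch of such a co-visit dependency network in Python; the table layout, column names, and normalisation are illustrative assumptions, not the authors’ exact estimator.

```python
# Minimal sketch of a behaviour-based dependency network, assuming a
# visits table with columns (user_id, date, poi_id). Column names and
# the co-visit weighting are illustrative, not the paper's exact method.
from collections import Counter
from itertools import combinations

import pandas as pd

visits = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "date":    ["2023-01-05"] * 5 + ["2023-01-06"],
    "poi_id":  ["cafe", "gym", "grocery", "cafe", "gym", "grocery"],
})

# Same-day visits by the same person link pairs of points of interest.
pair_counts = Counter()
for _, group in visits.groupby(["user_id", "date"]):
    pois = sorted(set(group["poi_id"]))
    for a, b in combinations(pois, 2):
        pair_counts[(a, b)] += 1

# Normalise by each POI's total visits to get a directed dependency weight:
# roughly, how much of b's foot traffic arrives on trips that also touch a.
total_visits = visits.groupby("poi_id").size()
for (a, b), n in pair_counts.items():
    print(f"{a} -> {b}: {n / total_visits[b]:.2f}")
    print(f"{b} -> {a}: {n / total_visits[a]:.2f}")
```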
Paper by Kiley Seymour et al: “Despite the dramatic rise of surveillance in our societies, only limited research has examined its effects on humans. While most research has focused on voluntary behaviour, no study has examined the effects of surveillance on more fundamental and automatic aspects of human perceptual awareness and cognition. Here, we show that being watched on CCTV markedly impacts a hardwired and involuntary function of human sensory perception—the ability to consciously detect faces. Using the method of continuous flash suppression (CFS), we show that when people are surveilled (N = 24), they are quicker than controls (N = 30) to detect faces. An independent control experiment (N = 42) ruled out an explanation based on demand characteristics and social desirability biases. These findings show that being watched impacts not only consciously controlled behaviours but also unconscious, involuntary visual processing. Our results have implications concerning the impacts of surveillance on basic human cognition as well as public mental health…(More)”.
Book edited by Melodena Stephens, Raed Awamleh and Frederic Sicre: “Anticipatory Governance is the systemic process of future shaping built on the understanding that the future is not a continuation of the past or present, thus making foresight a complex task requiring the engagement of the whole of government with its constituents in a constructive and iterative manner to achieve collective intelligence. Effective anticipatory governance amplifies the fundamental properties of agile government to build trust, challenge assumptions, and reach consensus. Moreover, anticipatory governance sets the foundation to adapt to exponential change. This seismic shift in the governance environment should lead to urgent rethinking of the ways and means governments and large corporate players formulate strategies, design processes, develop human capital and shape institutional culture to achieve public value.
From a long-term multigenerational perspective, anticipatory governance is a key component to ensure guardrails for the future. Systems thinking is needed to harness our collective intelligence, by tapping into knowledge trapped within nations, organizations, and people. Many of the wicked problems governments and corporations are grappling with – artificial intelligence applications and ethics, climate change, refugee migration, education for future skills, and health care for all – require a “system of systems”, or anticipatory governance.
Yet, no matter how much we invest in foresight and shaping the future, we still need an agile government approach to manage unintended outcomes and people’s expectations. Crisis management, which begins with listening to weak signals, sensemaking, intelligence management, reputation enhancement, and public value alignment and delivery, is critical. This book dives into the theory and practice of anticipatory governance and sets the agenda for future research…(More)”
Paper by Seliem El-Sayed, Ilona Kickbusch & Barbara Prainsack: “Most data governance frameworks are designed to protect the individuals from whom data originates. However, the impacts of digital practices extend to a broader population and are embedded in significant power asymmetries within and across nations. Further, inequities in digital societies impact everyone, not just those directly involved. Addressing these challenges requires an approach which moves beyond individual data control and is grounded in the values of equity and a just contribution of benefits and risks from data use. Solidarity-based data governance (in short: data solidarity) suggests prioritising data uses over data type and proposes that data uses that generate public value should be actively facilitated, those that generate significant risks and harms should be prohibited or strictly regulated, and those that generate private benefits with little or no public value should be ‘taxed’ so that profits generated by corporate data users are reinvested in the public domain. In the context of global health data governance, the public value generated by data use is crucial. This contribution clarifies the meaning, importance, and potential of public value within data solidarity and outlines methods for its operationalisation through the PLUTO tool, specifically designed to assess the public value of data uses…(More)”.
Article by Erika DeBenedictis, Ben Andrew & Pete Kelly: “In the age of Artificial Intelligence (AI), large high-quality datasets are needed to move the field of life science forward. However, the research community lacks strategies to incentivize collaboration on high-quality data acquisition and sharing. The government should fund collaborative roadmapping, certification, collection, and sharing of large, high-quality datasets in life science. In such a system, nonprofit research organizations engage scientific communities to identify key types of data that would be valuable for building predictive models, and define quality control (QC) and open science standards for collection of that data. Projects are designed to develop automated methods for data collection, certify data providers, and facilitate data collection in consultation with researchers throughout various scientific communities. Hosting of the resulting open data is subsidized as well as protected by security measures. This system would provide crucial incentives for the life science community to identify and amass large, high-quality open datasets that will immensely benefit researchers…(More)”.
Essay by Virginia Postrel: “When the future arrived, it felt… ordinary. What happened to the glamour of tomorrow?
Progress used to be glamorous. For the first two-thirds of the twentieth century, the terms modern, future, and world of tomorrow shimmered with promise.
Glamour is more than a synonym for fashion or celebrity, although these things can certainly be glamorous. So can a holiday resort, a city, or a career. The military can be glamorous, as can technology, science, or the religious life. It all depends on the audience. Glamour is a form of communication that, like humor, we recognize by its characteristic effect. Something is glamorous when it inspires a sense of projection and longing: if only . . .
Whatever its incarnation, glamour offers a promise of escape and transformation. It focuses deep, often unarticulated longings on an image or idea that makes them feel attainable. Both the longings – for wealth, happiness, security, comfort, recognition, adventure, love, tranquility, freedom, or respect – and the objects that represent them vary from person to person, culture to culture, era to era. In the twentieth century, ‘the future’ was a glamorous concept…
Much has been written about how and why culture and policy repudiated the visions of material progress that animated the first half of the twentieth century, including a special issue of this magazine inspired by J Storrs Hall’s book Where Is My Flying Car? The subtitle of James Pethokoukis’s recent book The Conservative Futurist is ‘How to create the sci-fi world we were promised’. Like Peter Thiel’s famous complaint that ‘we wanted flying cars, instead we got 140 characters’, the phrase captures a sense of betrayal. Today’s techno-optimism is infused with nostalgia for the retro future.
But the most common explanations for the anti-Promethean backlash fall short. It’s true but incomplete to blame the environmental consciousness that spread in the late sixties…
How exactly today’s longings might manifest themselves, whether in glamorous imagery or real-life social evolution, is hard to predict. But one thing is clear: For progress to be appealing, it must offer room for diverse pursuits and identities, permitting communities with different commitments and values to enjoy a landscape of pluralism without devolving into mutually hostile tribes. The ideal of the one best way passed long ago. It was glamorous in its day but glamour is an illusion…(More)”.
Article by Duncan C. McElfresh: “Say you run a hospital and you want to estimate which patients have the highest risk of deterioration so that your staff can prioritize their care [1]. You create a spreadsheet in which there is a row for each patient, and columns for relevant attributes, such as age or blood-oxygen level. The final column records whether the person deteriorated during their stay. You can then fit a mathematical model to these data to estimate an incoming patient’s deterioration risk. This is a classic example of tabular machine learning, a technique that uses tables of data to make inferences. This usually involves developing — and training — a bespoke model for each task. Writing in Nature, Hollmann et al. report a model that can perform tabular machine learning on any data set without being trained specifically to do so.
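The workflow McElfresh describes is concrete enough to sketch. Here is a minimal, hypothetical version of the hospital example using scikit-learn; the toy patient table and the choice of logistic regression are invented for illustration, and the article’s neural-network example would follow the same fit-then-predict pattern.

```python
# The classic tabular-machine-learning workflow: one bespoke model
# fitted to one spreadsheet. Toy data invented for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression

patients = pd.DataFrame({
    "age":          [34, 71, 58, 45, 80, 62],
    "blood_oxygen": [98, 91, 95, 97, 88, 93],
    "deteriorated": [0, 1, 0, 0, 1, 1],  # outcome recorded during the stay
})

X = patients[["age", "blood_oxygen"]]
y = patients["deteriorated"]

# Fit a bespoke model for this one task.
model = LogisticRegression().fit(X, y)

# Estimate an incoming patient's deterioration risk.
new_patient = pd.DataFrame({"age": [67], "blood_oxygen": [90]})
print(model.predict_proba(new_patient)[0, 1])
```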
Tabular machine learning shares a rich history with statistics and data science. Its methods are foundational to modern artificial intelligence (AI) systems, including large language models (LLMs), and its influence cannot be overstated. Indeed, many online experiences are shaped by tabular machine-learning models, which recommend products, generate advertisements and moderate social-media content [3]. Essential industries such as healthcare and finance are also steadily, if cautiously, moving towards increasing their use of AI.
Despite the field’s maturity, Hollmann and colleagues’ advance could be revolutionary. The authors’ contribution is known as a foundation model, which is a general-purpose model that can be used in a range of settings. You might already have encountered foundation models, perhaps unknowingly, through AI tools, such as ChatGPT and Stable Diffusion. These models enable a single tool to offer varied capabilities, including text translation and image generation. So what does a foundation model for tabular machine learning look like?
Let’s return to the hospital example. With spreadsheet in hand, you choose a machine-learning model (such as a neural network) and train the model with your data, using an algorithm that adjusts the model’s parameters to optimize its predictive performance (Fig. 1a). Typically, you would train several such models before selecting one to use — a labour-intensive process that requires considerable time and expertise. And of course, this process must be repeated for each unique task.
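For contrast, here is a sketch of the foundation-model workflow using the TabPFN package released by Hollmann and colleagues. It exposes a scikit-learn-style interface, so the code barely changes; the exact class name and constructor defaults are assumed here and may vary across package versions. The key difference is that fitting involves no per-task training loop.

```python
# A sketch of the foundation-model workflow on the same kind of patient
# table. TabPFNClassifier's interface is assumed from the project's
# public scikit-learn-style API and may differ between versions.
import pandas as pd
from tabpfn import TabPFNClassifier

patients = pd.DataFrame({
    "age":          [34, 71, 58, 45, 80, 62],
    "blood_oxygen": [98, 91, 95, 97, 88, 93],
    "deteriorated": [0, 1, 0, 0, 1, 1],
})

clf = TabPFNClassifier()  # one pretrained, general-purpose model
# fit() here is not gradient training: the table is absorbed as context,
# akin to prompting a large language model.
clf.fit(patients[["age", "blood_oxygen"]], patients["deteriorated"])

new_patient = pd.DataFrame({"age": [67], "blood_oxygen": [90]})
print(clf.predict_proba(new_patient)[0, 1])  # deterioration risk
```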
