Blog by UK Policy Lab: “Systems change is hard work, and it takes time. The reality is that no single system map or tool is enough to get you from point A to point B, from system now to system next. Over the last year, we have explored the latest in systems change theory and applied it to policymaking. In this four-part blog series, we share our reflections on the wealth of knowledge we’ve gained working on intractable issues surrounding how support is delivered for people experiencing multiple disadvantage. Along the way, we realised that we need to make new tools to support policy teams in doing this deep work in the future, and to see afresh the limitations of existing mental models for change and transformation.
Policy Lab has previously written about systems mapping as a useful process for understanding the interconnected nature of factors and actors that make up policy ecosystems. Here, we share our latest experimentation on how we can generate practical ideas for long-lasting and systemic change.
This blog includes:
An overview of what we did on our latest project – including the policy context, systems change frameworks we experimented with, and the bespoke project framework we created;
Our reflections on how we carried out the project;
A matrix which provides a practical guide for you to use this approach in your own work…(More)”.
Blog by Sara Marcucci and Stefaan Verhulst: “…In this week’s blog post, we delineate a taxonomy of anticipatory methods, categorizing them into three distinct sub-categories: Experience-based, Exploration-based, and Expertise-based methods. Our focus will be on the practical applications of these methods and on how both traditional and non-traditional data sources play a pivotal role within each of these categories. …Experience-based methods in the realm of migration policy focus on gaining insights from the lived experiences of individuals and communities involved in migration processes. These methods allow policymakers to tap into the experiences, challenges, and aspirations of those individuals and communities, fostering a more empathetic and holistic approach to policy development.
Through the lens of people’s experiences and viewpoints, it is possible to create and explore a multitude of scenarios. This in-depth exploration provides policymakers with a comprehensive understanding of these potential pathways, which, in turn, informs their decision-making process…(More)”.
Article by Regina Ta and Nicol Turner Lee: “Prompt-based generative artificial intelligence (AI) tools are quickly being deployed for a range of use cases, from writing emails and compiling legal cases to personalizing research essays in a wide range of educational, professional, and vocational disciplines. But language is not monolithic, and opportunities may be missed in developing generative AI tools for non-standard languages and dialects. Current applications often are not optimized for certain populations or communities and, in some instances, may exacerbate social and economic divisions. As noted by the Austrian linguist and philosopher Ludwig Wittgenstein, “The limits of my language mean the limits of my world.” This is especially true today, when the language we speak can change how we engage with technology, and the limits of our online vernacular can constrain the full and fair use of existing and emerging technologies.
As it stands now, speakers of most of the world’s languages are being left behind if their language is not among the world’s dominant ones, such as English, French, German, Spanish, Chinese, or Russian. There are over 7,000 languages spoken worldwide, yet a plurality of content on the internet is written in English, with the largest remaining online shares claimed by Asian and European languages like Mandarin or Spanish. Moreover, in the English language alone, there are over 150 dialects beyond “standard” U.S. English. Consequently, the large language models (LLMs) that train AI tools, like generative AI, rely on internet data that serves to increase the gap between standard and non-standard speakers, widening the digital language divide.
Among sociologists, anthropologists, and linguists, language is understood as a source of power, one that significantly influences the development and dissemination of new tools dependent upon learned, linguistic capabilities. Depending on where one sits within socio-ethnic contexts, native language can internally strengthen communities while also amplifying and replicating inequalities when co-opted by incumbent power structures to restrict immigrant and historically marginalized communities. For example, during the transatlantic slave trade, literacy was a weapon used by white supremacists to reinforce the dependence of Blacks on slave masters, which resulted in many anti-literacy laws being passed in the 1800s in most Confederate states…(More)”.
Article by Aaron Sankin and Surya Mattu: “A software company sold a New Jersey police department an algorithm that was right less than 1% of the time
Crime predictions generated for the police department in Plainfield, New Jersey, rarely lined up with reported crimes, an analysis by The Markup has found, adding new context to the debate over the efficacy of crime prediction software.
Geolitica, known as PredPol until a 2021 rebrand, produces software that ingests data from crime incident reports and produces daily predictions on where and when crimes are most likely to occur.
We examined 23,631 predictions generated by Geolitica between Feb. 25 and Dec. 18, 2018, for the Plainfield Police Department (PD). Each prediction we analyzed from the company’s algorithm indicated that one type of crime was likely to occur in a location not patrolled by Plainfield PD. In the end, the success rate was less than half a percent: fewer than 100 of the predictions lined up with a crime in the predicted category that was also later reported to police.
Diving deeper, we looked at predictions specifically for robberies or aggravated assaults that were likely to occur in Plainfield and found a similarly low success rate: 0.6 percent. The pattern was even worse when we looked at burglary predictions, which had a success rate of 0.1 percent.
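The percentages quoted above follow from a simple matched-over-total calculation. A minimal sketch in Python, using the article’s overall figures (the variable names are ours, and 100 is the upper bound the article gives for matched predictions):

```python
# Success rate of a prediction system: matched predictions / total predictions.
# Figures from the analysis above: 23,631 predictions, fewer than 100 matches.
total_predictions = 23_631
matched = 100  # upper bound: "fewer than 100" lined up with a reported crime

success_rate = matched / total_predictions
print(f"Overall hit rate: {success_rate:.2%}")  # about 0.42%, under half a percent
```

Even at this generous upper bound, the rate stays below 0.5 percent; the per-category figures (0.6 percent for robbery or aggravated assault, 0.1 percent for burglary) come from the same calculation restricted to predictions of a single crime type.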
“Why did we get PredPol? I guess we wanted to be more effective when it came to reducing crime. And having a prediction where we should be would help us to do that. I don’t know that it did that,” said Captain David Guarino of the Plainfield PD. “I don’t believe we really used it that often, if at all. That’s why we ended up getting rid of it.”…(More)”.
Blog by Sara Marcucci and Stefaan Verhulst: “…Migration is a dynamic phenomenon influenced by a variety of factors. As migration policies strive to keep pace with an ever-changing landscape, anticipating trends becomes increasingly pertinent. Traditionally, in the realm of anticipatory methods, a clear demarcation existed between foresight and forecast.
Forecast predominantly relies on quantitative techniques to predict future trends, utilizing historical data, mathematical models, and statistical analyses to provide numerical predictions applicable to the short-to-medium term, seeking to facilitate expedited policy making, resource allocation, and logistical planning.
Foresight methodologies conventionally leaned on qualitative insights to explore future possibilities, employing expert judgment, scenario planning, and holistic exploration to envision potential future scenarios. This qualitative approach has been characterized by a more long-term perspective, which seeks to explore a spectrum of potential futures in the long run.
More recently, this once-clear distinction between quantitative forecasting and qualitative foresight has begun to blur. New methodologies that embrace a mixed-method approach are emerging, challenging traditional paradigms and offering new pathways for understanding complex phenomena. Despite this evolution and the growing interest in these novel approaches, there currently exists no comprehensive taxonomy to guide practitioners in selecting the most appropriate method for a given objective. Moreover, given the current state of the art, there is a need for primers delving into these modern methodologies, filling a gap in the knowledge and resources that practitioners can leverage to enhance their forecasting and foresight endeavors…(More)”.
Article by Claudette Salinas Leyva et al: “Many of our institutions are focused on the short term. Whether corporations, government bodies, or even nonprofits, they tend to prioritize immediate returns and discount long-term value and sustainability. This myopia is behind planetary crises such as climate change and biodiversity loss and contributes to decision-making that harms the wellbeing of communities.
Policymakers worldwide are beginning to recognize the importance of governing for the long term. The United Nations is currently developing a Declaration on Future Generations to codify this approach. This collection of case studies profiles community-level institutions rooted in Indigenous traditions that focus on governing for the long term and preserving the interests of future generations…(More)”.
Article by Stefaan G. Verhulst: “We live at a moment of perhaps unprecedented global upheaval. From climate change to pandemics, from war to political disharmony, misinformation, and growing social inequality, policy and social change-makers today face not only new challenges but new types of challenges. In our increasingly complex and interconnected world, existing systems and institutions of governance, marked by hierarchical decision-making, are increasingly being replaced by overlapping nodes of multi-sector decision-making.
Data is proving critical to these new forms of decision-making, along with associated (and emerging) phenomena such as advanced analytics, machine learning, and artificial intelligence. Yet while the importance of data intelligence for policymakers is now widely recognized, there remain multiple challenges to operationalizing that insight–i.e., to move from data intelligence to decision intelligence.
In what follows, we explain what we mean by decision intelligence, and discuss why it matters. We then present six obstacles to better decision intelligence–challenges that prevent policymakers and others from translating insights into action. Finally, we end by offering one possible solution to these challenges: the concept of decision accelerator labs, operating on a hub and spoke model, and offering an innovative, interdisciplinary platform to facilitate the development of evidence-based, targeted solutions to public problems and dilemmas…(More)”.
Article by Sara Marcucci, Stefaan Verhulst, María Esther Cervantes, Elena Wüllhorst: “This blog is the first in a series that will be published weekly, dedicated to exploring innovative anticipatory methods for migration policy. Over the coming weeks, we will delve into various aspects of these methods, delving into their value, challenges, taxonomy, and practical applications.
This first blog serves as an exploration of the value proposition and challenges inherent in innovative anticipatory methods for migration policy. We delve into the various reasons why these methods hold promise for informing more resilient and proactive migration policies. These reasons include evidence-based policy development, enabling policymakers to ground their decisions in empirical evidence and future projections. Decision-takers, users, and practitioners can benefit from anticipatory methods for policy evaluation and adaptation, resource allocation, the identification of root causes, and the facilitation of humanitarian aid through early warning systems. However, it’s vital to acknowledge the challenges associated with the adoption and implementation of these methods, ranging from conceptual concerns such as fossilization, unfalsifiability, and the legitimacy of preemptive intervention, to practical issues like interdisciplinary collaboration, data availability and quality, capacity building, and stakeholder engagement. As we navigate through these complexities, we aim to shed light on the potential and limitations of anticipatory methods in the context of migration policy, setting the stage for deeper explorations in the coming blogs of this series…(More)”.
Blog by Darrel Ronald: “The definition for urban digital twins is too vague — so it is important to create a clearer picture of the types of urban digital twins that are available. Not all digital twins are the same, and each one comes with features and capabilities, strengths and weaknesses, as well as appropriate and inappropriate use cases….
In my proposed Urban Digital Twin Taxonomy above, I classify these products first by their Main Functionality (the Use Case), then by their Technology Platform. I highlight some of the main products within the different categories and their product scope. Next, I detail the different types of twins and offer brief strengths and weaknesses for each type. This taxonomy could apply to other industries such as architecture or manufacturing, but here it is applied specifically to cities and urban development projects.
Article by Stefaan G. Verhulst and Artur Kluz: “Technology has always played a crucial role in human history, both in winning wars and building peace. Even Leonardo da Vinci, the genius of the Renaissance, in his 1482 letter to Ludovico Il Moro Sforza, Duke of Milan, promised to invent new technological warfare for attack and defense. While serving top military and political leaders, he was working on technological advancements that could potentially have a significant impact on geopolitics.
Today, we are living in exceptional times, where disruptive technologies such as AI, space-based technologies, quantum computing, and many others are leading to the reimagination of everything around us and transforming our lives, state interactions in the global arena, and wars. The next great industrial revolution may well be occurring over 250 miles above us in outer space and putting our world into a new perspective. This is not just a technological transformation; this is a social and human transformation.
Perhaps to a greater extent than ever since World War II, recent news has been dominated by talk of war, as well as the destructive power of AI for human existence. The headlines are of missiles and offensives in Ukraine, of possible — and catastrophic — conflict over Taiwan, and of AI as humanity’s biggest existential threat.
A critical difference between this era and earlier times of conflict is the potential role of technology for peace. Along with traditional weaponry and armaments, it is clear that new space, data, and various other information and communication technologies will play an increasingly prominent role in 21st-century conflicts, especially when combined.
Much of the discussion today focuses on the potential offensive capabilities of technology. In a recent report titled “Seven Critical Technologies for Winning the Next War”, CSIS highlighted that “the next war will be fought on a high-tech battlefield….The consequences of failure on any of these technologies are tremendous — they could make the difference between victory and defeat.”
However, in the following discussion, we shift our focus to a distinctly different aspect of technology — its potential to cultivate peace and prevent conflicts. We present seven forms of PeaceTech, which encompass technologies that can actively avert or alleviate conflicts. These technologies are part of a broader range of innovations that contribute to the greater good of society and foster the overall well-being of humanity.
The application of frontier technologies can have fast, broad, and powerful effects in building peace. From preventing military conflict and disinformation, connecting people, facilitating dialogue, delivering humanitarian aid by drone, and resolving water-access conflicts, to using satellite imagery to monitor human rights violations and peacekeeping efforts, technology has demonstrated a strong footprint in building peace.
One important caveat is in order: readers may note the absence of data in the list below. We have chosen to include data as a cross-cutting category that applies across all seven technologies. This points to the ubiquity of data in today’s digital ecology. In an era of rapid datafication, data can no longer be classified as a single technology, but rather as an asset or tool embedded within virtually every other technology. (See our writings on the role of data for peace here)…(More)”.