Paper by Susan Ariel Aaronson: “As countries around the world expand their use of artificial intelligence (AI), the Organisation for Economic Co-operation and Development (OECD) has developed the most comprehensive website on AI policy, the OECD.AI Policy Observatory. Although the website covers public policies on AI, the author of this paper found that many governments failed to evaluate or report on their AI initiatives. This lack of reporting is a missed opportunity for policy makers to learn from their programs (the author found that less than one percent of the programs listed on the OECD.AI website had been evaluated). In addition, the author found discrepancies between what governments said they were doing on the OECD.AI website and what they reported on their own websites. In some cases, there was no evidence of government actions; in other cases, links to government sites did not work. Evaluations of AI policies are important because they help governments demonstrate that they are building trust in both AI and AI governance, and that policy makers are accountable to their fellow citizens…(More)”.
Data Collaborative Case Study: NYC Recovery Data Partnership
Report by the Open Data Policy Lab (The GovLab): “In July 2020, following severe economic and social losses due to the COVID-19 pandemic, the administration of New York City Mayor Bill de Blasio announced the NYC Recovery Data Partnership. This data collaborative asked private and civic organizations with assets relevant to New York City to provide their data to the city. Senior city leaders from the First Deputy Mayor’s Office, the Mayor’s Office of Operations, the Mayor’s Office of Information Privacy, and the Mayor’s Office of Data Analytics formed an internal coalition that served as a trusted intermediary, assessing requests from city agencies to use the data provided and allocating access accordingly. The data informed internal research conducted by various city agencies, including New York City Emergency Management’s Recovery Team and the NYC…(More)”
About: “Mapping Diversity is a platform for discovering key facts about diversity and representation in street names across Europe, and for sparking a debate about who is missing from our urban spaces.
We looked at the names of 145,933 streets across 30 major European cities, located in 17 different countries. More than 90% of the streets named after individuals are dedicated to white men. Where did all the other inhabitants of Europe end up? The lack of diversity in toponymy speaks volumes about our past and contributes to shaping Europe’s present and future…(More)”.
Principles for effective beneficial ownership disclosure
Open Ownership: “The Open Ownership Principles (OO Principles) are a framework for considering the elements that influence whether the implementation of reforms to improve the transparency of the beneficial ownership of corporate vehicles will lead to effective beneficial ownership disclosure: that is, disclosure that generates high-quality and reliable data and maximises usability for users.
The OO Principles are intended to support governments implementing effective beneficial ownership transparency reforms and guide international institutions, civil society, and private sector actors in understanding and supporting reforms. They are a tool to identify and separate issues affecting implementation, and they provide a framework for assessing and improving existing disclosure regimes. If implemented together, the OO Principles enable disclosure systems to generate actionable and usable data across the widest range of policy applications of beneficial ownership data.
The nine principles are interdependent, but can be broadly grouped by the three main ways they improve data. The Definition, Coverage, and Detail principles enable data disclosure and collection. The Central register, Access, and Structured data principles facilitate data storage and auditability. Finally, the Verification, Up-to-date and historical records, and Sanctions and enforcement principles improve data quality and reliability…Download January 2023 version (translated versions are forthcoming)”
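To make the Structured data principle concrete, here is a minimal sketch of what a machine-readable beneficial ownership statement could look like. The field names are loosely inspired by Open Ownership's Beneficial Ownership Data Standard but are simplified and illustrative; the entity, person, and values are invented.

```python
import json

# Illustrative sketch of "structured data" in a beneficial ownership
# disclosure: a machine-readable statement rather than a free-text filing.
# Field names are simplified from Open Ownership's Beneficial Ownership
# Data Standard (BODS); all values are hypothetical.
ownership_statement = {
    "statementType": "ownershipOrControlStatement",
    "subject": {"entityName": "Example Holdings Ltd"},  # the corporate vehicle
    "interestedParty": {"personName": "Jane Doe"},      # the beneficial owner
    "interests": [
        {
            "type": "shareholding",
            "share": {"exact": 62.5},                   # percent ownership
            "startDate": "2021-03-01",
        }
    ],
}

# Serialising to JSON yields a record that users can verify, join against
# sanctions lists, and retain as a historical snapshot.
record = json.dumps(ownership_statement, indent=2)
print(record)
```

A record in this shape is what lets the Verification and Up-to-date and historical records principles operate at scale: each field can be checked, dated, and compared across registers.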
Whole of government innovation
Report by Geoff Mulgan: “‘Whole of government’ approaches – which aim to mobilise and align many ministries and agencies around a common challenge – have a long history. There have been notable examples during major wars, and around attempts to digitize societies, to cut energy use and to respond to the COVID-19 pandemic.
This paper has been prepared as part of a European Commission programme which I’m chairing looking at ‘whole of government innovation’ and working with national governments to help them better align their actions.
My paper – linked below – looks at the lessons of history. It outlines the many tools governments can use to achieve cross-cutting goals, linking R&D to law, regulation and procurement, and collaborating with business, universities and civil society. It argues that it is unwise to rely only on committees and boards. It shows how these choices link to innovation strategy and funding, including the relevance of half a century of experimentation with moon-shots and missions.
The paper describes how the organisational challenges vary depending on the nature of the task; why governments need to avoid the common technology or ‘STI’ trap of focusing only on hardware and not on social arrangements or business models; why constellations and flotillas of coordination are usually more realistic than true ‘whole of government’ approaches; and the importance of mobilising hearts and minds as well as money and command.
Finally, it addresses the relevance of different approaches to current tasks such as the achievement of a net zero economy and society. The paper is shared as a working document – I’m keen to find new examples and approaches…(More)”.
How the Digital Transformation Changed Geopolitics
Paper by Dan Ciuriak: “In the late 2000s, a set of connected technological developments – introduction of the iPhone, deep learning through stacked neural nets, and application of GPUs to neural nets – resulted in the generation of truly astronomical amounts of data and provided the tools to exploit it. As the world emerged from the Great Financial Crisis of 2008-2009, data was decisively transformed from a mostly valueless by-product – “data exhaust” – to the “new oil”, the essential capital asset of the data-driven economy, and the “new plutonium” when deployed in social and political applications. This economy featured steep economies of scale, powerful economies of scope, network externalities in many applications, and pervasive information asymmetry. Strategic commercial policies at the firm and national levels were incentivized by the newfound scope to capture economic rents, destabilizing the rules-based system for trade and investment. At the same time, the new disruptive general-purpose technologies built on the nexus of Big Data, machine learning and artificial intelligence reconfigured geopolitical rivalry in several ways: by shifting great power rivalry onto new and critical grounds on which none had a decisive established advantage; by creating new vulnerabilities to information warfare in societies, especially open societies; and by enhancing the tools for social manipulation and the promotion of political personality cults. Machine learning, which essentially industrialized the very process of learning, drove an acceleration in the pace of innovation, which precipitated industrial policies driven by the desire to capture first mover advantage and by the fear of falling behind.
These developments provide a unifying framework to understand the progressive unravelling of the US-led global system as the decade unfolded, despite the fact that all the major innovations that drove the transition were within the US sphere and the US enjoyed first mover advantages. This is in stark contrast to the previous major economic transition to the knowledge-based economy, in which US leadership on the key innovations extended its dominance for decades and indeed powered its rise to its unipolar moment. The world did not respond well to the changed technological and economic conditions and hence we are at war: hot war, cold war, technological war, trade war, social war, and internecine political war. This paper focuses on the role of technological and economic conditions in shaping geopolitics, which is critical to understand if we are to respond to the current world disorder and to prepare to handle the coming transition in technological and economic conditions to yet another new economic era based on machine knowledge capital…(More)”.
Urban AI Guide
Guide by Popelka, S., Narvaez Zertuche, L., Beroche, H.: “The idea for this guide arose from conversations with city leaders, who were confronted with new technologies, like artificial intelligence, as a means of solving complex urban problems, but who felt they lacked the background knowledge to properly engage with and evaluate the solutions. In some instances, this knowledge gap produced a barrier to project implementation or led to unintended project outcomes.
The guide begins with a literature review, presenting the state of the art in research on urban artificial intelligence. It then diagrams and describes an “urban AI anatomy,” outlining and explaining the components that make up an urban AI system. Insights from experts in the Urban AI community enrich this section, illuminating considerations involved in each component. Finally, the guide concludes with an in-depth examination of three case studies: water meter lifecycle in Winnipeg, Canada; curb digitization and planning in Los Angeles, USA; and air quality monitoring in Vilnius, Lithuania. Collectively, the case studies highlight the diversity of ways in which artificial intelligence can be operationalized in urban contexts, as well as the steps and requirements necessary to implement an urban AI project.
Since the field of urban AI is constantly evolving, we anticipate updating the guide annually. Please consider filling out the contribution form if you have an urban AI use case that has been operationalized. We may contact you to include the use case as a case study in a future edition of the guide.
As a continuation of the guide, we offer customized workshops on urban AI, oriented toward municipalities and other urban stakeholders who are interested in learning more about how artificial intelligence interacts with urban environments. Please contact us if you would like more information on this program…(More)”.
Democracy Report 2023: Defiance in the Face of Autocratization
New report by Varieties of Democracy (V-Dem): “…the largest global dataset on democracy with over 31 million data points for 202 countries from 1789 to 2022. Involving almost 4,000 scholars and other country experts, V-Dem measures hundreds of different attributes of democracy. V-Dem enables new ways to study the nature, causes, and consequences of democracy embracing its multiple meanings. The first section of the report shows global levels of democracy sliding back and the advances made over the past 35 years diminishing. Most of the drastic changes have taken place within the last ten years, while there are large regional variations in the levels of democracy people experience. The second section offers analyses of the geographies and population sizes of democratizing and autocratizing countries. In the third section we focus on the countries undergoing autocratization, and on the indicators deteriorating the most, including media censorship, repression of civil society organizations, and academic freedom. While disinformation, polarization, and autocratization reinforce each other, democracies reduce the spread of disinformation. This is a sign of hope, of better times ahead. And this is precisely the message carried forward in the fourth section, where we switch our focus to examples of countries that managed to push back and where democracy resurfaces. Scattered over the world, these success stories share common elements that may bear implications for international democracy support and protection efforts. The final section of this year’s report offers a new perspective on shifting global balances of economic and trade power as a result of autocratization…(More)”.
The Future of Compute
Independent Review by a UK Expert Panel: “…Compute is a material part of modern life. It is among the critical technologies lying behind innovation, economic growth and scientific discoveries. Compute improves our everyday lives. It underpins all the tools, services and information we hold on our handheld devices – from search engines and social media, to streaming services and accurate weather forecasts. This technology may be invisible to the public, but life today would be very different without it.
Sectors across the UK economy, both new and old, are increasingly reliant upon compute. By leveraging the capability that compute provides, businesses of all sizes can extract value from the enormous quantity of data created every day; reduce the cost and time required for research and development (R&D); improve product design; accelerate decision making processes; and increase overall efficiency. Compute also enables advancements in transformative technologies, such as AI, which themselves lead to the creation of value and innovation across the economy. This all translates into higher productivity and profitability for businesses and robust economic growth for the UK as a whole.
Compute powers modelling, simulations, data analysis and scenario planning, and thereby enables researchers to develop new drugs; find new energy sources; discover new materials; mitigate the effects of climate change; and model the spread of pandemics. Compute is required to tackle many of today’s global challenges and brings invaluable benefits to our society.
Compute’s effects on society and the economy have already been and, crucially, will continue to be transformative. The scale of compute capabilities keeps accelerating at pace. The performance of the world’s fastest compute has grown by a factor of 626 since 2010. The compute requirements of the largest machine learning models have grown 10 billion times over the last 10 years. We expect compute demand to grow significantly as compute capability continues to increase. Technology today operates very differently to how it did 10 years ago and, in a decade’s time, it will have changed once again.
Yet, despite compute’s value to the economy and society, the UK lacks a long-term vision for compute…(More)”.
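The growth figures quoted in the review imply strikingly short doubling times, which a quick back-of-the-envelope calculation makes vivid. The 13-year window for the x626 figure is an assumption here (2010 to the review's 2023 publication):

```python
import math

def doubling_time_years(growth_factor: float, period_years: float) -> float:
    """Years per doubling implied by a total growth factor over a period."""
    return period_years / math.log2(growth_factor)

# Fastest supercomputer performance: x626 growth, assumed over ~13 years (2010-2023).
hpc = doubling_time_years(626, 13)

# Largest machine learning models' compute requirements: x10 billion over 10 years.
ml = doubling_time_years(1e10, 10)

print(f"Top supercomputer: one doubling every {hpc:.1f} years")
print(f"Largest ML models: one doubling every {ml * 12:.1f} months")
```

Under these assumptions, top-end supercomputer performance doubles roughly every 1.4 years, while the compute used by the largest ML models doubles roughly every 3.6 months, far faster than the hardware itself improves.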
Lighthouse Reports: “Governments all over the world are experimenting with predictive algorithms in ways that are largely invisible to the public. What limited reporting there has been on this topic has largely focused on predictive policing and risk assessments in criminal justice systems. But there is an area where even more far-reaching experiments are underway on vulnerable populations with almost no scrutiny.
Fraud detection systems are widely deployed in welfare states, ranging from complex machine learning models to crude spreadsheets. The scores they generate have potentially life-changing consequences for millions of people. Until now, public authorities have typically resisted calls for transparency, either by claiming that disclosure would increase the risk of fraud or by citing the need to protect proprietary technology.
The sales pitch for these systems promises that they will recover millions of euros defrauded from the public purse. The caricature of the benefit cheat is a modern take on the classic trope of the undeserving poor, and much of the public debate in Europe, home to the world's most generous welfare states, is intensely politically charged.
The true extent of welfare fraud is routinely exaggerated by consulting firms, which are often also the algorithm vendors, talking it up to nearly 5 percent of benefits spending, while some national auditors’ offices estimate it at between 0.2 and 0.4 percent of spending. Distinguishing between honest mistakes and deliberate fraud in complex public systems is messy and hard.
When opaque technologies are deployed in search of political scapegoats, the potential for harm among some of the poorest and most marginalised communities is significant.
Hundreds of thousands of people are being scored by these systems based on data mining operations where there has been scant public consultation. The consequences of being flagged by the “suspicion machine” can be drastic, with fraud controllers empowered to turn the lives of suspects inside out…(More)”.
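To see why a single score can carry such weight, consider a deliberately simplified, hypothetical sketch of how such a system might work: a logistic model over a handful of claimant features. The actual deployed systems vary widely (from spreadsheets to machine learning models) and their features and weights are typically undisclosed; every name and number below is invented for illustration.

```python
import math

# Hypothetical feature weights for an illustrative "suspicion" model.
# None of these reflect any real deployed system.
FEATURE_WEIGHTS = {
    "years_on_benefits": 0.08,
    "num_address_changes": 0.35,
    "reported_income_gap": 0.50,
}
BIAS = -2.0  # baseline: most claimants score low

def suspicion_score(claimant: dict) -> float:
    """Return a 0-1 'risk' score via a logistic function of weighted features."""
    z = BIAS + sum(FEATURE_WEIGHTS[k] * claimant.get(k, 0.0) for k in FEATURE_WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

claimant = {"years_on_benefits": 4, "num_address_changes": 3, "reported_income_gap": 1}
score = suspicion_score(claimant)
print(f"score = {score:.2f}")
```

Even a crude model like this, applied over mined data, produces a ranked list of people to investigate, which is precisely why the lack of transparency and consultation described above matters so much: the features and thresholds, not the claimants' conduct alone, determine who gets flagged.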