Ukrainians Are Using an App to Return Home


Article by Yuliya Panfil and Allison Price: “Two years into Russia’s invasion of Ukraine, the human toll continues to mount. At least 11 million people have been displaced by heavy bombing, drone strikes, and combat, and well over a million homes have been damaged or destroyed. But just miles from the front lines of what is a conventional land invasion, something decidedly unconventional has been deployed to help restore Ukrainian communities.

Thousands of families whose homes have been hit by Russian shelling are using their smartphones to file compensation claims, access government funds, and begin to rebuild their homes. This innovation is part of eRecovery, the world’s first-ever example of a government compensation program for damaged or destroyed homes rolled out digitally, at scale, in the midst of a war. It’s one of the ways in which Ukraine’s tech-savvy government and populace have leaned into digital solutions to help counter Russian aggression with resilience and a speedier approach to reconstruction and recovery.

According to Ukraine’s Housing, Land and Property Technical Working Group, since its launch last summer, eRecovery has processed more than 83,000 compensation claims for damaged or destroyed property and paid out more than 45,000. In addition, more than half a million Ukrainians have taken the first step in the compensation process by filing a property damage report through Ukraine’s e-government platform, Diia. eRecovery’s potential to transform the way governments get people back into their homes following a war, natural disaster, or other calamity is hard to overstate…(More)”.

Unconventional data, unprecedented insights: leveraging non-traditional data during a pandemic


Paper by Kaylin Bolt et al: “The COVID-19 pandemic prompted new interest in non-traditional data sources to inform response efforts and mitigate knowledge gaps. While non-traditional data offers some advantages over traditional data, it also raises concerns related to biases, representativity, informed consent and security vulnerabilities. This study focuses on three specific types of non-traditional data: mobility, social media, and participatory surveillance platform data. Qualitative results are presented on the successes, challenges, and recommendations of key informants who used these non-traditional data sources during the COVID-19 pandemic in Spain and Italy….

Non-traditional data proved valuable in providing rapid results and filling data gaps, especially when traditional data faced delays. Increased data access and innovative collaborative efforts across sectors facilitated its use. Challenges included unreliable access and data quality concerns, particularly the lack of comprehensive demographic and geographic information. To further leverage non-traditional data, participants recommended prioritizing data governance, establishing data brokers, and sustaining multi-institutional collaborations. The value of non-traditional data was perceived as underutilized in public health surveillance, program evaluation and policymaking. Participants saw opportunities to integrate them into public health systems with the necessary investments in data pipelines, infrastructure, and technical capacity…(More)”.

Public sector capacity matters, but what is it?


Blog by Rainer Kattel, Mariana Mazzucato, Rosie Collington, Fernando Fernandez-Monge, Iacopo Gronchi, Ruth Puttick: “As governments turn increasingly to public sector innovations, challenges, missions and transformative policy initiatives, the need to understand and develop public sector capacities is ever more important. In IIPP’s project with Bloomberg Philanthropies to develop a Public Sector Capabilities Index, we propose to define public sector capacities through three inter-connected layers: state capacities, organisational capabilities, and dynamic capabilities of public organisations.

The idea that governments should be able to design and deliver effective policies has existed ever since we had governments. A quick search in Google’s Ngram viewer shows that the use of state capacity in published books has experienced exponential growth since the late 1980s. It is, however, not a coincidence that focus on state and public sector capacities more broadly emerges in the shadow of new public management and neoliberal governance and policy reforms. Rather than understanding governance as a collaborative effort between all sectors, these reforms gave normative preference to business practices. Increasing focus on public sector capacity as a concept should thus be understood as an attempt to rebalance our understanding of how change happens in societies — through cross-sectoral co-creation — and as an effort to build the muscles in public organisations to work together to tackle socio-economic challenges.

We propose to define public sector capacities through three inter-connected layers: state capacities, organisational routines, and dynamic capabilities of public organisations…(More)”.

How will AI shape our future cities?


Article by Ying Zhang: “For city planners, a bird’s-eye view of a map showing buildings and streets is no longer enough. They need to simulate changes to bus routes or traffic light timings before implementation to know how they might affect the population. Now, they can do so with digital twins – often referred to as “mirror worlds” – which allow them to simulate scenarios more safely and cost-effectively in a three-dimensional virtual replica.

Cities such as New York, Shanghai and Helsinki are already using digital twins. In 2022, the city of Zurich launched its own version. Anyone can use it to measure the height of buildings, determine the shadows they cast and take a look into the future to see how Switzerland’s largest city might develop. Traffic congestion, a housing shortage and higher energy demands are becoming pressing issues in Switzerland, where 74% of the population already lives in urban areas.

But updating and managing digital twins will become more complex as population densities and the levels of detail increase, according to architect and urban designer Aurel von Richthofen of the consultancy Arup.

The world’s current urban planning models are like “individual silos” where “data cannot be shared, which makes urban planning not as efficient as we expect it to be”, said von Richthofen at a recent event hosted by the Swiss innovation network Swissnex. …

The underlying data is key to whether a digital twin city is effective. But getting access to quality data from different organisations is extremely difficult. Sensors, drones and mobile devices may collect data in real-time. But they tend to be organised around different knowledge domains – such as land use, building control, transport or ecology – each with its own data collection culture and physical models…(More)”

The Radical How


Report by Public Digital: “…We believe in the old adage about making the most of a crisis. We think the constraints facing the next government provide an unmissable opportunity to change how government works for the better.

Any mission-focused government should be well equipped to define, from day one, what outcomes it wants to bring about.

But radically changing what the government does is only part of the challenge. We also need to change how government does things. The usual methods, we argue in this paper, are too prone to failure and delay.

There’s a different approach to public service organisation, one based on multidisciplinary teams, starting with citizen needs, and scaling iteratively by testing assumptions. We’ve been arguing in favour of it for years now, and the more it gets used, the more we see success and timely delivery.

We think taking a new approach makes it possible to shift government from an organisation of programmes and projects, to one of missions and services. It offers even constrained administrations an opportunity to improve their chances of delivering outcomes, reducing risk, saving money, and rebuilding public trust…(More)”.

i.AI Consultation Analyser


New Tool by AI.Gov.UK: “Public consultations are a critical part of the process of making laws, but analysing consultation responses is complex and very time consuming. Working with the No10 data science team (10DS), the Incubator for Artificial Intelligence (i.AI) is developing a tool to make the process of analysing public responses to government consultations faster and fairer.

The Analyser uses AI and data science techniques to automatically extract patterns and themes from the responses, and turns them into dashboards for policy makers.

The goal is for computers to do what they are best at: finding patterns and analysing large amounts of data. That means humans are free to do the work of understanding those patterns.
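The Analyser’s internals are not public, and a production system would likely use machine-learning topic models rather than anything this simple. As a minimal sketch of what “automatically extracting themes” can mean, the toy below ranks frequent terms across a handful of invented consultation responses (the stopword list and sample texts are hypothetical, not from the tool):

```python
from collections import Counter
import re

# Ad hoc stopword list for this toy example only.
STOPWORDS = {"the", "a", "an", "and", "to", "of", "for",
             "would", "make", "more", "than", "please", "add"}

def extract_themes(responses, top_n=3):
    """Rank the most frequent non-stopword terms across free-text
    responses as a crude proxy for recurring themes."""
    counts = Counter()
    for text in responses:
        tokens = re.findall(r"[a-z']+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(top_n)

# Invented sample responses to an imaginary transport consultation.
responses = [
    "Cycling lanes would make the road safer for everyone.",
    "Safer crossings matter more than cycling lanes.",
    "Please add cycling lanes and better lighting.",
]
print(extract_themes(responses))  # [('cycling', 3), ('lanes', 3), ('safer', 2)]
```

The human analyst’s job then shifts from tallying to interpreting: deciding what a cluster like “cycling/lanes/safer” actually signifies for policy.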

[Screenshot: donut chart of respondents who agree or disagree, and a bar chart showing the popularity of prevalent themes]

Government runs 700-800 consultations a year on matters of importance to the public. Some are very small, but a large consultation might attract hundreds of thousands of written responses.

A consultation attracting 30,000 responses requires a team of around 25 analysts for 3 months to analyse the data and write the report. And it’s not unheard of to get double that number.

If we can apply automation in a way that is fair, effective and accountable, we could save most of that £80m…(More)”

Participatory democracy in the EU should be strengthened with a Standing Citizens’ Assembly


Article by James Mackay and Kalypso Nicolaïdis: “EU citizens have multiple participatory instruments at their disposal, from the right to petition the European Parliament (EP) to the European Citizens’ Initiative (ECI), from the European Commission’s public online consultations and Citizens’ Dialogues to the role of the European Ombudsman as an advocate for the public vis-à-vis the EU institutions.

While these mechanisms are broadly welcome, they have – unfortunately – remained too timid and largely ineffective in bolstering bottom-up participation. They tend to involve experts and organised interest groups rather than ordinary citizens. They don’t encourage debates on non-experts’ policy preferences, and they are too often executed at the discretion of political elites to justify pre-existing policy decisions.

In short, they feel more like consultative mechanisms than significant democratic innovations. That’s why the EU should be bold and demonstrate its democratic leadership by institutionalising its newly-created Citizens’ Panels into a Standing Citizens’ Assembly with rotating membership chosen by lot and renewed on a regular basis…(More)”.

Are Evidence-Based Medicine and Public Health Incompatible?


Essay by Michael Schulson: “It’s a familiar pandemic story: In September 2020, Angela McLean and John Edmunds found themselves sitting in the same Zoom meeting, listening to a discussion they didn’t like.

At some point during the meeting, McLean — professor of mathematical biology at the University of Oxford, dame commander of the Order of the British Empire, fellow of the Royal Society of London, and then-chief scientific adviser to the United Kingdom’s Ministry of Defence — sent Edmunds a message on WhatsApp.

“Who is this fuckwitt?” she asked.

The message was evidently referring to Carl Heneghan, director of the Centre for Evidence-Based Medicine at Oxford. He was on Zoom that day, along with McLean and Edmunds and two other experts, to advise the British prime minister on the Covid-19 pandemic.

Their disagreement — recently made public as part of a British government inquiry into the Covid-19 response — is one small chapter in a long-running clash between two schools of thought within the world of health care.

McLean and Edmunds are experts in infectious disease modeling; they build elaborate simulations of pandemics, which they use to predict how infections will spread and how best to slow them down. Often, during the Covid-19 pandemic, such models were used alongside other forms of evidence to urge more restrictions to slow the spread of the disease. Heneghan, meanwhile, is a prominent figure in the world of evidence-based medicine, or EBM. The movement aims to help doctors draw on the best available evidence when making decisions and advising patients. Over the past 30 years, EBM has transformed the practice of medicine worldwide.

Whether it can transform the practice of public health — which focuses not on individuals, but on keeping the broader community healthy — is a thornier question…(More)”.

Digitalisation and citizen engagement: comparing participatory budgeting in Rome and Barcelona


Book chapter by Giorgia Mattei, Valentina Santolamazza and Martina Manzo: “The digitalisation of participatory budgeting (PB) is a growing phenomenon, as digital tools could help achieve greater citizen engagement. However, comparing two similar cases – i.e. Rome and Barcelona – some differences emerge in how digital tools were integrated into the PB processes. The present study describes how digital tools have positively influenced PB throughout its different phases, making communication more transparent, involving a wider audience, empowering people and, consequently, making citizens’ engagement more effective. Nevertheless, the research examines the different elements adopted to overcome the limits of digitalisation and shows various approaches and results…(More)”.

Data, Privacy Laws and Firm Production: Evidence from the GDPR


Paper by Mert Demirer, Diego J. Jiménez Hernández, Dean Li & Sida Peng: “By regulating how firms collect, store, and use data, privacy laws may change the role of data in production and alter firm demand for information technology inputs. We study how firms respond to privacy laws in the context of the EU’s General Data Protection Regulation (GDPR) by using seven years of data from a large global cloud-computing provider. Our difference-in-differences estimates indicate that, in response to the GDPR, EU firms decreased data storage by 26% and data processing by 15% relative to comparable US firms, becoming less “data-intensive.” To estimate the costs of the GDPR for firms, we propose and estimate a production function where data and computation serve as inputs to the production of “information.” We find that data and computation are strong complements in production and that firm responses are consistent with the GDPR representing a 20% increase in the cost of data on average. Variation in the firm-level effects of the GDPR and industry-level exposure to data, however, drives significant heterogeneity in our estimates of the impact of the GDPR on production costs…(More)”
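The difference-in-differences logic behind the paper’s headline estimate can be illustrated with a toy calculation (all numbers below are invented for illustration, not taken from the paper): the GDPR effect is the change in EU firms’ outcome net of the contemporaneous change among comparable, untreated US firms.

```python
import math

# Hypothetical mean log data storage for EU (treated) and US (control)
# firms, before and after the GDPR took effect. All numbers invented.
eu_pre, eu_post = 10.0, 9.7
us_pre, us_post = 10.0, 10.0

# Difference-in-differences: the treated group's change net of the
# control group's change over the same period.
did = (eu_post - eu_pre) - (us_post - us_pre)

# A change in logs translates to an approximate percentage change.
pct = (1 - math.exp(did)) * 100
print(f"DiD estimate: {did:.2f} log points (~{pct:.0f}% decrease)")
```

The numbers are chosen so the toy result echoes the paper’s 26% storage decline; the actual estimates come from a regression with firm and time controls, not a four-cell comparison like this.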