Creating a digital commons


Report by the IPPR (UK): “There are, today, almost no parts of life that are untouched by the presence of data. Virtually every action we take produces some form of digital trail – our phones track our locations, our browsers track searches, our social network apps log our friends and family – even when we are only dimly aware of it.

It is the combination of this near-ubiquitous gathering of data with fast processing that has generated the economic and social transformation of the last few years – one that, if current developments in artificial intelligence (AI) continue, is only likely to accelerate. Combined with data-enabled technology, from the internet of things to 3D printing, we are potentially on the cusp of a radically different economy and society.

As the world emerges from the first phase of the pandemic, the demands for a socially just and sustainable recovery have grown. The data economy can and should be an essential part of that reconstruction, from the efficient management of energy systems to providing greater flexibility in working time. However, without effective public policy, and democratic oversight and management, the danger is that the tendencies in the data economy that we have already seen towards monopoly and opacity – reinforced, so far, by the crisis – will continue to dominate. It is essential, then, that planning for a fairer, more sustainable economy in the future build in active public policy for data…

This report focusses closely on data as the fundamental building block of the emerging economy, and argues that its use, management, ownership, and control are critical to shaping the future…(More)”.

Data for Policy: Junk-Food Diet or Technological Frontier?


Blog by Ed Humpherson at Data & Policy: “At the Office for Statistics Regulation, thinking about these questions is our day job. We set the standards for Government statistics and data through our Code of Practice for Statistics. And we review how Government departments are living up to these standards when they publish data and statistics. We routinely look at how Government statistics are used in public debate.

Based on this, I would propose four factors that ensure that new data sources and tools serve the public good. They do so when:

1. When data quality is properly tested and understood:

As my colleague Penny Babb wrote recently in a blog: “Don’t trust the data. If you’ve found something interesting, something has probably gone wrong!” People who work routinely with data develop a sort of innate scepticism, which Penny’s blog captures neatly. Understanding the limitations of both the data, and the inferences you make about the data, is the starting point for any appropriate role for data and policy. Accepting results and insights from new data at face value is a mistake. Much better to test the quality, explore the risks of mistakes, and only then to share findings and conclusions.

2. When the risks of misleadingness are considered:

At OSR, we have an approach to misleadingness that focuses on whether a misuse of data might lead a listener to a wrong conclusion. In fact, by “wrong” we don’t mean in some absolute sense of objective truth; more that if they received the data presented in a different and more faithful way, they would change their mind. Here’s a really simple example: someone might hear that, of two neighbouring countries, one has a much lower fatality rate, when comparing deaths to positive tests for Covid-19. …
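The pitfall behind that example can be made concrete with a toy calculation (the numbers below are hypothetical, not taken from the blog): two countries with the same number of infections and deaths will report very different deaths-per-positive-test "fatality rates" simply because they test at different rates.

```python
# Toy illustration (hypothetical numbers): the same underlying epidemic,
# measured with different testing rates, yields very different "fatality
# rates" when deaths are divided by positive tests.

def naive_fatality_rate(deaths, positive_tests):
    """Deaths divided by confirmed (positive-test) cases."""
    return deaths / positive_tests

# Both countries: 100,000 true infections, 1% of infections fatal.
true_infections = 100_000
deaths = int(true_infections * 0.01)  # 1,000 deaths in each country

# Country A detects 80% of infections through wide testing;
# Country B detects only 20% through narrow testing.
cases_a = int(true_infections * 0.80)  # 80,000 positive tests
cases_b = int(true_infections * 0.20)  # 20,000 positive tests

rate_a = naive_fatality_rate(deaths, cases_a)  # 1.25%
rate_b = naive_fatality_rate(deaths, cases_b)  # 5.00%

# Country A appears four times "safer", yet the underlying risk is identical.
print(f"Country A: {rate_a:.2%}, Country B: {rate_b:.2%}")
```

A listener who heard only the two headline rates would likely change their mind if shown the testing rates alongside them – which is exactly the OSR test for misleadingness described above.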

3. When the data fill gaps

Data gaps come in several forms. One gap, highlighted by the interest in real-time economic indicators, is timing. Economic statistics don’t really tell us what’s going on right now. Figures like GDP, trade and inflation tell us about some point in the (admittedly quite) recent past. This is the attraction of the real-time economic indicators, which the Bank of England have drawn on in their decisions during the pandemic. They give policymakers a much more real-time feel by filling in this timing gap.

Other gaps are not about time but about coverage….

4. When the data are available

Perhaps the most important thing for data and policy is to democratise the notion of who the data are for. Data (and policy itself) are not just for decision-making elites. They are a tool to help people make sense of their world and what is going on in their community, and to help frame and guide the choices they make.

For this reason, I often instinctively recoil at narratives of data that focus on the usefulness of data to decision-makers. Of course, we are all decision-makers of one kind or another, and data can help us all. But I always suspect that the “data for decision-makers” narrative harbours an assumption that decisions are made by senior, central, expert people, who make decisions on behalf of society; people who are, in the words of the musical Hamilton, in the room where it happens. It’s this implication that I find uncomfortable.

That’s why, during the pandemic, our work at the Office for Statistics Regulation has repeatedly argued that data should be made available. We have published a statement that any management information referred to by a decision maker should be published clearly and openly. We call this equality of access.

We fight for equality of access. We have secured the publication of lots of data — on positive Covid-19 cases in England’s Local Authorities, on Covid-19 in prisons, on antibody testing in Scotland…. and several others.

Data and policy are a powerful mix. They offer huge benefits to society in terms of defining, understanding and solving problems, and thereby in improving lives. We should be pleased that the coming together of data and policy is being sped up by the pandemic.

But to secure these benefits, we need to focus on four things: quality, misleadingness, gaps, and public availability….(More)”

Public perceptions on data sharing: key insights from the UK and the USA


Paper by Saira Ghafur, Jackie Van Dael, Melanie Leis, Ara Darzi, and Aziz Sheikh: “Data science and artificial intelligence (AI) have the potential to transform the delivery of health care. Health care as a sector, with all of the longitudinal data it holds on patients across their lifetimes, is positioned to take advantage of what data science and AI have to offer. The current COVID-19 pandemic has shown the benefits of sharing data globally to permit a data-driven response through rapid data collection, analysis, modelling, and timely reporting.

Despite its obvious advantages, data sharing is a controversial subject, with researchers and members of the public justifiably concerned about how and why health data are shared. The most common concern is privacy; even when data are (pseudo-)anonymised, there remains a risk that a malicious hacker could, using only a few datapoints, re-identify individuals. For many, it is often unclear whether the risks of data sharing outweigh the benefits.

A series of surveys over recent years indicate that the public holds a range of views about data sharing. Over the past few years, there have been several important data breaches and cyberattacks. This has resulted in patients and the public questioning the safety of their data, including the prospect or risk of their health data being shared with unauthorised third parties.

We surveyed people across the UK and the USA to examine public attitudes towards data sharing, data access, and the use of AI in health care. These two countries were chosen as comparators as both are high-income countries that have had substantial national investments in health information technology (IT), with established track records of using data to support health-care planning, delivery, and research. The UK and USA, however, have sharply contrasting models of health-care delivery, making it interesting to observe whether these differences affect public attitudes.

Willingness to share anonymised personal health information varied across receiving bodies (figure). The more commercial the purpose of the receiving institution (eg, for an insurance or tech company), the less often respondents were willing to share their anonymised personal health information in both the UK and the USA. Older respondents (≥35 years) in both countries were generally less likely to trust any organisation with their anonymised personal health information than younger respondents (<35 years)…

Despite the benefits of big data and technology in health care, our findings suggest that the rapid development of novel technologies has been received with concern. Growing commodification of patient data has increased awareness of the risks involved in data sharing. There is a need for public standards that secure regulation and transparency of data use and sharing and support patient understanding of how data are used and for what purposes….(More)”.

20’s the limit: How to encourage speed reductions


Report by The Wales Centre for Public Policy: “This report has been prepared to support the Welsh Government’s plan to introduce a 20mph national default speed limit in 2022. It aims to address two main questions: 1) What specific behavioural interventions might be implemented to promote driver compliance with 20mph speed limits in residential areas; and 2) are there particular demographics, community characteristics or other features that should form the basis of a segmentation approach?

The reasons for speeding are complex, but many behaviour change techniques have been successfully applied to road safety, including some which use behavioural insights or “nudges”.

Drivers can be segmented into three types: defiers (a small minority), conformers (the majority) and champions (a minority). Conformers are law-abiding citizens who respect social norms – getting this group to comply can achieve a tipping point.

Other sectors have shown that providing information is only effective if part of a wider package of measures, and that people are most open to change at times of disruption or learning (e.g. learner drivers)….(More)”.

Rethinking citizen engagement for an inclusive energy transition


Urban Futures Studio: “In July 2020, we published our new essay ‘What, How and Who? Designing inclusive interactions in the energy transition’ (Bronsvoort, Hoffman and Hajer, 2020). In this essay, we argue that how the interactions between citizens and governments are shaped and enacted has a large influence on who gets involved and to what extent people feel heard. To apply this approach to cases, we distinguish between three dimensions of interaction:

  • What (the defined object or issue at hand)
  • How (the setting and staging of the interaction)
  • Who (the target groups and protagonists of the process)

Focusing on the issue of form, we argue that processes for interaction between citizens and governments should be designed in a way that is more future-oriented, organized over the long term, in closer proximity to citizens, and with attention to the powerful role of ‘in-betweeners’ and ‘in-between’ places such as community houses, where people can meet to deliberate on the wide range of possible futures for their neighbourhood.

Towards a multiplicity of future visions for sustainable cities
The energy transition has major consequences for the way we live, work, move and consume. For such complex transitions, governments need to engage and collaborate with citizens and other stakeholders. Their engagement enriches existing visions on future neighbourhoods, inform local policies and stimulate change. But how do you shape and organize such a participatory process? While governments use a wide range of public participation methods, many researchers have emphasized the limitations of many of these conventional methods with regard to the inclusion of diverse groups of citizens and in bridging discrepancies between government approaches and people’s lived experiences.

Rethinking citizen engagement for an inclusive energy transition
To help rethink citizen engagement, the Urban Futures Studio investigates existing and new approaches to citizen engagement and how they are practised by governments and societal actors. Following our essay, our next project on citizen engagement includes a study of its relation to experimentation as a novel mode of governance. The goal of this research is to provide insights into how citizen engagement manifests itself in the context of experimental governance at the neighbourhood level. By investigating the interactions between citizens, governments and other stakeholders in different types of participatory projects, we aim to gain a better understanding of how citizens are engaged and included in energy transition experiments and how their inclusion can be improved.

We use a relational approach to citizen engagement, viewing participatory processes as collective practices that both shape and are shaped by their ‘matter of concern’, their public, and their setting and staging. This view places emphasis on the form and conditions under which the interaction takes place. For example, the initiative of Places of Hope showed that engagement can be organised in diverse ways and can create new collectives….(More)”.

Four Principles for Integrating AI & Good Governance


Oxford Commission on AI and Good Governance: “Many governments, public agencies and institutions already employ AI in providing public services, the distribution of resources and the delivery of governance goods. In the public sector, AI-enabled governance may afford new efficiencies that have the potential to transform a wide array of public service tasks. But short-sighted design and use of AI can create new problems, entrench existing inequalities, and calcify and ultimately undermine government organizations.

Frameworks for the procurement and implementation of AI in public service have largely remained undeveloped. Frequently, existing regulations and national laws are no longer fit for purpose to ensure good behaviour (of either AI or private suppliers) and are ill-equipped to provide guidance on the democratic use of AI. As technology evolves rapidly, we need rules to guide the use of AI in ways that safeguard democratic values. Under what conditions can AI be put into service for good governance?

We offer a framework for integrating AI with good governance. We believe that with dedicated attention and evidence-based policy research, it should be possible to overcome the combined technical and organizational challenges of successfully integrating AI with good governance. Doing so requires working towards:


  • Inclusive Design: issues around discrimination and bias of AI in relation to inadequate data sets, exclusion of minorities and under-represented groups, and the lack of diversity in design.
  • Informed Procurement: issues around acquisition and development in relation to due diligence, design and usability specifications, and the assessment of risks and benefits.
  • Purposeful Implementation: issues around the use of AI in relation to interoperability, training needs for public servants, and integration with decision-making processes.
  • Persistent Accountability: issues around the accountability and transparency of AI in relation to ‘black box’ algorithms, the interpretability and explainability of systems, and monitoring and auditing…(More)”

Addressing trust in public sector data use


Centre for Data Ethics and Innovation: “Data sharing is fundamental to effective government and the running of public services. But it is not an end in itself. Data needs to be shared to drive improvements in service delivery and benefit citizens. For this to happen sustainably and effectively, public trust in the way data is shared and used is vital. Without such trust, the government and wider public sector risks losing society’s consent, setting back innovation as well as the smooth running of public services. Maximising the benefits of data-driven technology therefore requires a solid foundation of societal approval.

AI and data-driven technology offer extraordinary potential to improve decision making and service delivery in the public sector – from improved diagnostics to more efficient infrastructure and personalised public services. This makes effective use of data more important than it has ever been, and requires a step-change in the way data is shared and used. Yet sharing more data also poses risks and challenges to current governance arrangements.

The only way to build trust sustainably is to operate in a trustworthy way. Without adequate safeguards the collection and use of personal data risks changing power relationships between the citizen and the state. Insights derived from big data and the matching of different data sets can also undermine individual privacy or personal autonomy. Trade-offs are required which reflect democratic values, wider public acceptability and a shared vision of a data-driven society. CDEI has a key role to play in exploring this challenge and setting out how it can be addressed. This report identifies barriers to data sharing, but focuses on building and sustaining the public trust which is vital if society is to maximise the benefits of data-driven technology.

There are many areas where the sharing of anonymised and identifiable personal data by the public sector already improves services, prevents harm, and benefits the public. Over the last 20 years, different governments have adopted various measures to increase data sharing, including creating new legal sharing gateways. However, despite efforts to increase the amount of data sharing across the government, and significant successes in areas like open data, data sharing continues to be challenging and resource-intensive. This report identifies a range of technical, legal and cultural barriers that can inhibit data sharing.

Barriers to data sharing in the public sector

Technical barriers include limited adoption of common data standards and inconsistent security requirements across the public sector. Such inconsistency can prevent data sharing, or increase the cost and time for organisations to finalise data sharing agreements.

While there are often pre-existing legal gateways for data sharing, underpinned by data protection legislation, there is still a large amount of legal confusion on the part of public sector bodies wishing to share data, which can cause them to start from scratch when determining legality and to commit significant resources to legal advice. It is not unusual for the development of data sharing agreements to delay the projects for which the data is intended. While the legal scrutiny of data sharing arrangements is an important part of governance, improving the efficiency of these processes – without sacrificing their rigour – would allow data to be shared more quickly and at less expense.

Even when legal, the permissive nature of many legal gateways means significant cultural and organisational barriers to data sharing remain. Individual departments and agencies decide whether or not to share the data they hold and may be overly risk averse. Data sharing may not be prioritised by a department if it would require them to bear costs to deliver benefits that accrue elsewhere (i.e. to those gaining access to the data). Departments sharing data may need to invest significant resources to do so, as well as considering potential reputational or legal risks. This may hold up progress towards finding common agreement on data sharing. When there is an absence of incentives, even relatively small obstacles may mean data sharing is not deemed worthwhile by those who hold the data – despite the fact that other parts of the public sector might benefit significantly….(More)”.

German coronavirus experiment enlists help of concertgoers


Philip Oltermann at the Guardian: “German scientists are planning to equip 4,000 pop music fans with tracking gadgets and bottles of fluorescent disinfectant to get a clearer picture of how Covid-19 could be prevented from spreading at large indoor concerts.

As cultural mass gatherings across the world remain on hold for the foreseeable future, researchers in eastern Germany are recruiting volunteers for a “coronavirus experiment” with the singer-songwriter Tim Bendzko, to be held at an indoor stadium in the city of Leipzig on 22 August.

Participants, aged between 18 and 50, will wear matchstick-sized “contact tracer” devices on chains around their necks that transmit a signal at five-second intervals and collect data on each person’s movements and proximity to other members of the audience.
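The epidemiological value of such devices comes from turning the raw five-second proximity pings into contact events between audience members. A minimal sketch of that post-processing (the field names and thresholds below are invented for illustration – the article does not describe the actual data format) might look like:

```python
from collections import defaultdict

# Hypothetical post-processing of contact-tracer logs: each record is
# (timestamp_seconds, device_a, device_b, distance_metres), one per ping.
# A "contact" is a pair of devices whose accumulated time within
# CONTACT_RADIUS of each other reaches MIN_DURATION.

CONTACT_RADIUS = 1.5    # metres (assumed threshold)
MIN_DURATION = 15 * 60  # seconds (assumed threshold)
PING_INTERVAL = 5       # the devices transmit every five seconds

def contact_pairs(pings):
    """Return device pairs whose close-proximity time meets MIN_DURATION."""
    close_time = defaultdict(int)
    for _ts, a, b, dist in pings:
        if dist <= CONTACT_RADIUS:
            pair = tuple(sorted((a, b)))
            close_time[pair] += PING_INTERVAL  # each ping covers one interval
    return {pair for pair, secs in close_time.items() if secs >= MIN_DURATION}

# Two devices logged close together for 16 minutes, a third only for 1 minute.
pings = [(t, "dev1", "dev2", 1.0) for t in range(0, 16 * 60, 5)]
pings += [(t, "dev1", "dev3", 1.0) for t in range(0, 60, 5)]
print(contact_pairs(pings))  # only the dev1–dev2 pair qualifies
```

Aggregating contacts this way – rather than publishing raw movement traces – is one plausible reason for the five-second, proximity-only design described in the article.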

Inside the venue, they will also be asked to disinfect their hands with a fluorescent hand-sanitiser – designed not just to add a layer of protection but to allow scientists to scour the venue with UV lights after the concerts to identify surfaces where transmission of the virus through smear (surface) infection is most likely to take place.

Vapours from a fog machine will help visualise the possible spread of coronavirus via aerosols, which the scientists will try to predict via computer-generated models in advance of the event.

The €990,000 cost of the Restart-19 project will be shouldered by the federal states of Saxony and Saxony-Anhalt. The project’s organisers say the aim is to “identify a framework” for how larger cultural and sports events could be held “without posing a danger for the population” after 30 September….

To stop the Leipzig experiment from becoming the source of a new outbreak, signed-up volunteers will be sent a DIY test kit and have a swab at a doctor’s practice or laboratory 48 hours before the concert starts. Those who cannot show proof of a negative test at the door will be denied entry….(More)”.

Open Government Playbook


Gov.UK: “The document provides guidance and advice to help policy officials follow open government principles when carrying out their work…

The Playbook has been developed in response to the growing demand from policymakers, communications, and digital professionals to integrate the principles of open government into their roles. The content of the Playbook was drafted using existing resources (see the ‘further reading’ section), and was developed in consultation with open government experts from Involve and the Open Government Partnership….(More)”.

Coronavirus: how the pandemic has exposed AI’s limitations


Kathy Peach at The Conversation: “It should have been artificial intelligence’s moment in the sun. With billions of dollars of investment in recent years, AI has been touted as a solution to every conceivable problem. So when the COVID-19 pandemic arrived, a multitude of AI models were immediately put to work.

Some hunted for new compounds that could be used to develop a vaccine, or attempted to improve diagnosis. Some tracked the evolution of the disease, or generated predictions for patient outcomes. Some modelled the number of cases expected given different policy choices, or tracked similarities and differences between regions.

The results, to date, have been largely disappointing. Very few of these projects have had any operational impact – hardly living up to the hype or the billions in investment. At the same time, the pandemic highlighted the fragility of many AI models. From entertainment recommendation systems to fraud detection and inventory management – the crisis has seen AI systems go awry as they struggled to adapt to sudden collective shifts in behaviour.

The unlikely hero

The unlikely hero emerging from the ashes of this pandemic is instead the crowd. Crowds of scientists around the world sharing data and insights faster than ever before. Crowds of local makers manufacturing PPE for hospitals failed by supply chains. Crowds of ordinary people organising through mutual aid groups to look after each other.

COVID-19 has reminded us of just how quickly humans can adapt existing knowledge, skills and behaviours to entirely new situations – something that highly specialised AI systems just can’t do. At least, not yet….

In one of the experiments, researchers from the Istituto di Scienze e Tecnologie della Cognizione in Rome studied the use of an AI system designed to reduce social biases in collective decision-making. The AI, which held back information from the group members on what others thought early on, encouraged participants to spend more time evaluating the options by themselves.

The system succeeded in reducing the tendency of people to “follow the herd” by failing to hear diverse or minority views, or challenge assumptions – all of which are criticisms that have been levelled at the British government’s scientific advisory committees throughout the pandemic…(More)”.