Addressing trust in public sector data use


Centre for Data Ethics and Innovation: “Data sharing is fundamental to effective government and the running of public services. But it is not an end in itself. Data needs to be shared to drive improvements in service delivery and benefit citizens. For this to happen sustainably and effectively, public trust in the way data is shared and used is vital. Without such trust, the government and wider public sector risk losing society’s consent, setting back innovation as well as the smooth running of public services. Maximising the benefits of data-driven technology therefore requires a solid foundation of societal approval.

AI and data-driven technology offer extraordinary potential to improve decision making and service delivery in the public sector – from improved diagnostics to more efficient infrastructure and personalised public services. This makes effective use of data more important than it has ever been, and requires a step-change in the way data is shared and used. Yet sharing more data also poses risks and challenges to current governance arrangements.

The only way to build trust sustainably is to operate in a trustworthy way. Without adequate safeguards, the collection and use of personal data risks changing power relationships between the citizen and the state. Insights derived from big data and the matching of different data sets can also undermine individual privacy or personal autonomy. Trade-offs are required that reflect democratic values, wider public acceptability and a shared vision of a data-driven society. CDEI has a key role to play in exploring this challenge and setting out how it can be addressed. This report identifies barriers to data sharing, but focuses on building and sustaining the public trust which is vital if society is to maximise the benefits of data-driven technology.

There are many areas where the sharing of anonymised and identifiable personal data by the public sector already improves services, prevents harm, and benefits the public. Over the last 20 years, different governments have adopted various measures to increase data sharing, including creating new legal sharing gateways. However, despite efforts to increase the amount of data sharing across the government, and significant successes in areas like open data, data sharing continues to be challenging and resource-intensive. This report identifies a range of technical, legal and cultural barriers that can inhibit data sharing.

Barriers to data sharing in the public sector

Technical barriers include limited adoption of common data standards and inconsistent security requirements across the public sector. Such inconsistency can prevent data sharing, or increase the cost and time for organisations to finalise data sharing agreements.

While there are often pre-existing legal gateways for data sharing, underpinned by data protection legislation, there is still a large amount of legal confusion on the part of public sector bodies wishing to share data, which can cause them to start from scratch when determining legality and to commit significant resources to legal advice. It is not unusual for the development of data sharing agreements to delay the projects for which the data is intended. While the legal scrutiny of data sharing arrangements is an important part of governance, improving the efficiency of these processes – without sacrificing their rigour – would allow data to be shared more quickly and at less expense.

Even when sharing is legal, the permissive nature of many legal gateways means significant cultural and organisational barriers to data sharing remain. Individual departments and agencies decide whether or not to share the data they hold and may be overly risk averse. Data sharing may not be prioritised by a department if it would require them to bear costs to deliver benefits that accrue elsewhere (i.e. to those gaining access to the data). Departments sharing data may need to invest significant resources to do so, as well as considering potential reputational or legal risks. This may hold up progress towards finding common agreement on data sharing. When there is an absence of incentives, even relatively small obstacles may mean data sharing is not deemed worthwhile by those who hold the data – despite the fact that other parts of the public sector might benefit significantly….(More)”.

German coronavirus experiment enlists help of concertgoers


Philip Oltermann at the Guardian: “German scientists are planning to equip 4,000 pop music fans with tracking gadgets and bottles of fluorescent disinfectant to get a clearer picture of how Covid-19 could be prevented from spreading at large indoor concerts.

As cultural mass gatherings across the world remain on hold for the foreseeable future, researchers in eastern Germany are recruiting volunteers for a “coronavirus experiment” with the singer-songwriter Tim Bendzko, to be held at an indoor stadium in the city of Leipzig on 22 August.

Participants, aged between 18 and 50, will wear matchstick-sized “contact tracer” devices on chains around their necks that transmit a signal at five-second intervals and collect data on each person’s movements and proximity to other members of the audience.
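The analysis such pings enable can be sketched in a few lines. The article does not describe the devices' data format, so the record layout, field names and the 1.5 m contact threshold below are illustrative assumptions, not details of the Leipzig experiment:

```python
from itertools import combinations
from math import hypot

# Hypothetical ping format: (person_id, timestamp_s, x_m, y_m),
# emitted every 5 seconds by each participant's tracker.
CONTACT_DISTANCE_M = 1.5  # assumed proximity threshold

def contacts_at(pings):
    """Return pairs of people within CONTACT_DISTANCE_M in one 5 s snapshot."""
    pairs = []
    for (a, _, ax, ay), (b, _, bx, by) in combinations(pings, 2):
        if hypot(ax - bx, ay - by) <= CONTACT_DISTANCE_M:
            pairs.append((a, b))
    return pairs

snapshot = [
    ("p1", 0, 0.0, 0.0),
    ("p2", 0, 1.0, 0.0),   # 1.0 m from p1 -> counted as a contact
    ("p3", 0, 5.0, 5.0),   # far from both
]
print(contacts_at(snapshot))  # [('p1', 'p2')]
```

Aggregating such snapshots over a whole concert would yield, for each pair, the total time spent in close proximity – the quantity epidemiologists need to model transmission risk at the event.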

Inside the venue, they will also be asked to disinfect their hands with a fluorescent hand-sanitiser – designed not just to add a layer of protection but also to allow scientists to scour the venue with UV lights after the concerts to identify surfaces where a transmission of the virus through smear infection is most likely to take place.

Vapours from a fog machine will help visualise the possible spread of coronavirus via aerosols, which the scientists will try to predict via computer-generated models in advance of the event.

The €990,000 cost of the Restart-19 project will be shouldered jointly by the federal states of Saxony and Saxony-Anhalt. The project’s organisers say the aim is to “identify a framework” for how larger cultural and sports events could be held “without posing a danger for the population” after 30 September….

To stop the Leipzig experiment from becoming the source of a new outbreak, signed-up volunteers will be sent a DIY test kit and have a swab at a doctor’s practice or laboratory 48 hours before the concert starts. Those who cannot show proof of a negative test at the door will be denied entry….(More)”.

Open Government Playbook


Gov.UK: “The document provides guidance and advice to help policy officials follow open government principles when carrying out their work…

The Playbook has been developed in response to growing demand from policy, communications, and digital professionals to integrate the principles of open government into their roles. The content of the Playbook was drafted using existing resources (see the ‘further reading’ section) and developed in consultation with open government experts from Involve and the Open Government Partnership….(More)”.

Coronavirus: how the pandemic has exposed AI’s limitations


Kathy Peach at The Conversation: “It should have been artificial intelligence’s moment in the sun. With billions of dollars of investment in recent years, AI has been touted as a solution to every conceivable problem. So when the COVID-19 pandemic arrived, a multitude of AI models were immediately put to work.

Some hunted for new compounds that could be used to develop a vaccine, or attempted to improve diagnosis. Some tracked the evolution of the disease, or generated predictions for patient outcomes. Some modelled the number of cases expected given different policy choices, or tracked similarities and differences between regions.

The results, to date, have been largely disappointing. Very few of these projects have had any operational impact – hardly living up to the hype or the billions in investment. At the same time, the pandemic highlighted the fragility of many AI models. From entertainment recommendation systems to fraud detection and inventory management, the crisis has seen AI systems go awry as they struggled to adapt to sudden collective shifts in behaviour.
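The failures described here are, at root, distribution shift: models kept scoring inputs that no longer resembled their training data. A minimal sketch of how such a shift can be flagged uses the Population Stability Index, a common monitoring heuristic; the data, bins and threshold are illustrative, not drawn from any system mentioned above:

```python
from math import log

def psi(baseline, recent, bins):
    """Population Stability Index over shared bin edges.
    A value above 0.25 is a common 'significant shift' flag."""
    def shares(values):
        counts = [0] * (len(bins) - 1)
        for v in values:
            for i in range(len(bins) - 1):
                if bins[i] <= v < bins[i + 1]:
                    counts[i] += 1
                    break
        total = max(sum(counts), 1)
        return [max(c / total, 1e-6) for c in counts]  # avoid log(0)

    b, r = shares(baseline), shares(recent)
    return sum((ri - bi) * log(ri / bi) for bi, ri in zip(b, r))

bins = [0, 10, 20, 30, 40]
pre_pandemic = [5, 7, 12, 15, 22, 25, 8, 14]      # e.g. daily transactions
lockdown     = [31, 35, 38, 33, 36, 39, 32, 34]   # sudden collective shift
print(psi(pre_pandemic, lockdown, bins) > 0.25)   # True -> model needs review
```

A monitoring job running this check on each input feature would have raised an alarm in the first weeks of lockdown, long before the model's predictions visibly degraded.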

The unlikely hero

The unlikely hero emerging from the ashes of this pandemic is instead the crowd. Crowds of scientists around the world sharing data and insights faster than ever before. Crowds of local makers manufacturing PPE for hospitals failed by supply chains. Crowds of ordinary people organising through mutual aid groups to look after each other.

COVID-19 has reminded us of just how quickly humans can adapt existing knowledge, skills and behaviours to entirely new situations – something that highly-specialised AI systems just can’t do. At least not yet….

In one of the experiments, researchers from the Istituto di Scienze e Tecnologie della Cognizione in Rome studied the use of an AI system designed to reduce social biases in collective decision-making. The AI, which held back information from the group members on what others thought early on, encouraged participants to spend more time evaluating the options by themselves.

The system succeeded in reducing the tendency of people to “follow the herd” – failing to hear diverse or minority views, or to challenge assumptions – all criticisms that have been levelled at the British government’s scientific advisory committees throughout the pandemic…(More)”.

How to engage with policy makers: A guide for academics in the arts and humanities


Institute for Government: “The Arts and Humanities Research Council and the Institute for Government have been working in partnership for six years on the Engaging with Government programme – a three-day course for researchers in the arts and humanities. This programme helps academics develop the knowledge and skills they need to engage effectively with government and parliamentary bodies at all levels, along with the other organisations involved in the policy-making process. We, in turn, have learned a huge amount from our participants, who now form an active alumni network brimming with expertise about how to engage with policy in practice. This guide brings together some of that learning.

Arts and humanities researchers tend to have fewer formal and established routes into government than scientists. But they can, and do, engage productively in policy making. They contribute both expertise (advice based on knowledge of a field) and evidence (facts and information) and provide new ways of framing policy debates that draw on philosophical, cultural or historical perspectives.

As this guide shows, there are steps that academics can take to improve their engagement with public policy and to make it meaningful for their research. While these activities may involve an investment of time, they offer the opportunity to make a tangible difference, and are often a source of great satisfaction and inspiration for further work.

The first part of this guide describes the landscape of policy making in the UK and some of the common ways academics can engage with it.

Part two sets out six lessons from the Engaging with Government programme, illustrated with practical examples from our alumni and speaker network. These lessons are:

  • Understand the full range of individuals and groups involved in policy making – who are the key players and who do they talk to?
  • Be aware of the political context – how does your research fit in with current thinking on the issue?
  • Communicate in ways that policy makers find useful – consider your audience and be prepared to make practical recommendations.
  • Develop and maintain networks – seek to make connections with people who share your policy interest, both in person and online.
  • Remember that you are the expert – be prepared to share your general knowledge of a subject as well as your specific research.
  • Adopt a long-term perspective – you will need to be open-minded and patient to engage successfully….(More)”.

Do nudgers need budging? A comparative analysis of European smart meter implementation


Paper by Sarah Giest: “Nudging is seen as complementing or replacing existing policy tools by altering people’s choice architectures towards behaviours that align with government aims, but it has fallen short of meeting those targets. Crucially, governments do not nudge citizens directly, but need private agents to nudge their consumers. Based on this notion, the paper takes an institutional approach to nudging. Rather than looking at the relationship between nudger and nudgee, the research analyses the regulatory and market structures that affect nudge implementation by private actors, captured by the ‘budge’ idea. Focusing on the European energy policy domain, the paper analyses the contextual factors of green nudges that are initiated by Member States and implemented by energy companies. The findings show that in the smart meter context, there are regulatory measures that affect implementation of smart meters and that government has a central role to ‘budge’, due to the dependence on private agents….(More)”.

Hammer or nudge? New brief on international policy options for COVID-19


Paper by Luc Soete: “…But over time, as the scientific commentary on TV and radio in my two home countries, the Netherlands and Belgium, as well as in neighbouring Germany and France, became dominated by each country’s own national virology and epidemiology experts explaining why their country’s approach to ‘flattening the curve’ and bringing down the reproduction rate was best, it became clear, even to a non-expert in the field like myself, that many of the science-based policies used to contain COVID-19 were first and foremost based on ‘hypotheses’ and, with the exception of Germany, not really on facts. As Anthony Fauci, director of the US National Institute of Allergy and Infectious Diseases and probably the world’s most respected virologist, once put it: “Data is real. The model is hypothesis.”

So at the risk of being an ultracrepidarian – an old word which has suddenly risen in popularity – it seemed appropriate to have a closer, more critical look at the science-based policy advice during this COVID-19 pandemic. For virologists and epidemiologists, the logical approach to a new, unknown but highly infectious virus such as SARS-CoV-2, spreading globally at pandemic speed, is ‘the hammer’: the tool to crush down quickly and radically, through extreme measures (social distancing, confinement, lockdown, travel restrictions), the spread of the virus and get the transmission rate as far below one as possible. The stricter the confinement measures, the better.

For a social scientist or social science-based policy adviser, a hammer is anything but a useful tool with which to approach society or the economy. Her or his preference will instead go to measures such as ‘nudges’, which alter people’s behaviour in a predictable way without coercion. Actually, the first COVID-19 measure was based on a typical ‘nudge’: improving hand hygiene among healthcare workers, now extended to the whole population. ‘Nudging’ in the face of a new virus such as SARS-CoV-2 consists of making sure incremental policy measures build up to a societal behavioural change, from hand hygiene and social distancing to confinement and various forms of lockdown. It is crucial to measure the additional, marginal impact of each measure and its contribution to the overall reduction in the transmission of the virus. Introducing all measures at once, as in the ‘hammer’ strategy, provides little useful information on the effectiveness of each measure (on the contrary, in fact). In a period of deconfinement, one now has little information on which measures are likely to be the most effective.

From a nudge perspective, achieving a change in social behaviour with respect to physical distancing – the so-called one-and-a-half-metre society – will be an essential variable, and measuring its impact on the spread of the virus crucial. One reason is that full adoption of such physical distancing will, automatically and without the need for coercion, prevent large and smaller social gatherings without authorities having to specify the rules. This is implicit in the principle of nudging: it will be the providers, the entrepreneurs of personal service sectors, who will have to come up with organisational innovations enabling physical distancing in the safe delivery of such services.

Most noteworthy, however, is the purely national setting within which most virology and epidemiological science-based policy advice is currently framed. This contrasts sharply with the actual scientific research in the field, which is today thoroughly global, based on shared data and open access. For years now, epidemiological studies have taken individual countries as ‘containers’ for data collection and data analysis. It is also the national setting that provides the framework for estimating the capacity of medical facilities, especially the total number of available intensive care units needed to handle COVID-19 patients in each country. In the case of Europe, fear of cross-border contamination has as a result led to the reintroduction of internal borders which had ‘disappeared’ 25 years ago. In doing so, COVID-19 has undermined the notion of European values. This policy brief is my attempt to clarify the situation….(More)”.

Mapping Mobility Functional Areas (MFA) using Mobile Positioning Data to Inform COVID-19 Policies


EU Science Hub: “This work introduces the concept of data-driven Mobility Functional Areas (MFAs) as geographic zones with a high degree of intra-mobility exchanges. Such information, calculated at European regional scale thanks to mobile data, can be useful to inform targeted re-escalation policy responses in cases of future COVID-19 outbreaks (avoiding large-area or even national lockdowns). In such events, the geographic distribution of MFAs would define territorial areas to which lockdown interventions could be limited, with the result of minimizing socio-economic consequences of such policies. The analysis of the time evolution of MFAs can also be thought of as a measure of how human mobility changes not only in intensity but also in patterns, providing innovative insights into the impact of mobility containment measures. This work presents a first analysis for 15 European countries (14 EU Member States and Norway)….(More)”.
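The excerpt does not spell out the clustering method, but the core idea of grouping regions with high mutual exchange can be sketched with a simple stand-in: a union-find merge over a flow table, joining any two regions whose mutual flow exceeds a cutoff. The region names, flow counts and threshold below are invented for illustration:

```python
def mobility_functional_areas(flows, threshold):
    """flows: {(region_a, region_b): trips}. Regions joined by a flow
    at or above `threshold` end up in the same MFA (union-find merge)."""
    parent = {}

    def find(r):
        parent.setdefault(r, r)
        while parent[r] != r:
            parent[r] = parent[parent[r]]  # path compression
            r = parent[r]
        return r

    for (a, b), trips in flows.items():
        if trips >= threshold:
            parent[find(a)] = find(b)  # merge the two clusters

    areas = {}
    for r in parent:
        areas.setdefault(find(r), set()).add(r)
    return sorted(map(sorted, areas.values()))

flows = {
    ("Leipzig", "Halle"): 900,     # strong exchange -> same MFA
    ("Halle", "Magdeburg"): 50,    # weak -> kept separate
    ("Magdeburg", "Stendal"): 700,
}
print(mobility_functional_areas(flows, threshold=500))
# [['Halle', 'Leipzig'], ['Magdeburg', 'Stendal']]
```

Under this sketch, a localised outbreak in Leipzig would argue for restrictions across the Leipzig–Halle area, where residents mix heavily, while leaving the Magdeburg–Stendal area untouched – precisely the targeted, sub-national response the report envisages.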

Ethical and societal implications of algorithms, data, and artificial intelligence: a roadmap for research


Report by the Nuffield Foundation and the Leverhulme Centre for the Future of Intelligence: “The aim of this report is to offer a broad roadmap for work on the ethical and societal implications of algorithms, data, and AI (ADA) in the coming years. It is aimed at those involved in planning, funding, and pursuing research and policy work related to these technologies. We use the term ‘ADA-based technologies’ to capture a broad range of ethically and societally relevant technologies based on algorithms, data, and AI, recognising that these three concepts are not totally separable from one another and will often overlap.

A shared set of key concepts and concerns is emerging, with widespread agreement on some of the core issues (such as bias) and values (such as fairness) that an ethics of algorithms, data, and AI should focus on. Over the last two years, these have begun to be codified in various codes and sets of ‘principles’. Agreeing on these issues, values and high-level principles is an important step for ensuring that ADA-based technologies are developed and used for the benefit of society.

However, we see three main gaps in this existing work: (i) a lack of clarity or consensus around the meaning of central ethical concepts and how they apply in specific situations; (ii) insufficient attention given to tensions between ideals and values; (iii) insufficient evidence on both (a) key technological capabilities and impacts, and (b) the perspectives of different publics….(More)”.

Data is Dangerous: Comparing the Risks that the United States, Canada and Germany See in Data Troves


Paper by Susan Ariel Aaronson: “Data and national security have a complex relationship. Data is essential to national defense — to understanding and countering adversaries. Data underpins many modern military tools from drones to artificial intelligence. Moreover, to protect their citizens, governments collect lots of data about their constituents. Those same datasets are vulnerable to theft, hacking, and misuse. In 2013, the Department of Defense’s research arm (DARPA) funded a study examining whether “the availability of data provide[s] a determined adversary with the tools necessary to inflict nation-state level damage”. The results were not made public. Given the risks to their citizens’ data, defense officials should be vociferous advocates for interoperable data protection rules.

This policy brief uses case studies to show that inadequate governance of personal data can also undermine national security. The case studies represent diverse internet sectors affecting netizens differently. I do not address malware or disinformation, which are also issues of data governance, but have already been widely researched by other scholars. I illuminate how policymakers, technologists, and the public were unprepared for the ways in which spillovers from inadequate governance affected national security. I then make some specific recommendations about what we can do about this problem….(More)”.