UK response to pandemic hampered by poor data practices


Report for the Royal Society: “The UK is well behind other countries in making use of data to gain a real-time understanding of the spread and economic impact of the pandemic, according to Data Evaluation and Learning for Viral Epidemics (DELVE), a multi-disciplinary group convened by the Royal Society.

The report, Data Readiness: Lessons from an Emergency, highlights how data such as aggregated and anonymised mobility and payment transaction data, already gathered by companies, could be used to give a more accurate picture of the pandemic at national and local levels.  That could in turn lead to improvements in evaluation and better targeting of interventions.

Maximising the value of big data at a time of crisis requires careful cooperation across the private sector, which is already gathering these data; the public sector, which can provide a base for aggregating the data and overseeing their correct use; and researchers, who have the skills to analyse them for the public good. This work needs to be developed in accordance with data protection legislation and must respect people’s concerns about data security and privacy.

The report calls on the Government to extend the powers of the Office for National Statistics to enable it to support trustworthy access to ‘happenstance’ data – data that are already gathered but not for a specific public health purpose – and to fund pathfinder projects that focus on specific policy questions, such as how we nowcast economic metrics and how we better understand population movements.

Neil Lawrence, DeepMind Professor of Machine Learning at the University of Cambridge, Senior AI Fellow at The Alan Turing Institute and an author of the report, said: “The UK has talked about making better use of data for the public good, but we have had statements of good intent, rather than action.  We need to plan better for national emergencies. We need to look at the National Risk Register through the lens of what data would help us to respond more effectively. We have to learn our lessons from experiences in this pandemic and be better prepared for future crises.  That means doing the work now to ensure that companies, the public sector and researchers have pathfinder projects up and running to share and analyse data and help the government to make better informed decisions.”  

During the pandemic, counts of the daily flow of people between more than 3,000 districts in Spain have been available at the click of a button, allowing policy makers to more effectively understand how the movement of people contributes to the spread of the virus. This was based on a collaboration between the country’s three main mobile phone operators.  In France, measuring the impact of the pandemic on consumer spending on a daily and weekly scale was possible as a result of coordinated cooperation with the country’s national interbank network.

Professor Lawrence added: “Mobile phone companies might provide a huge amount of anonymised and aggregated data that would allow us a much greater understanding of how people move around, potentially spreading the virus as they go.  And there is a wealth of other data, such as from transport systems. The more we understand about this pandemic, the better we can tackle it. We should be able to work together, the private and the public sectors, to harness big data for massive positive social good and do that safely and responsibly.”…(More)”
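To make the idea of “aggregated and anonymised” mobility data a little more concrete, here is a minimal sketch, not drawn from the report itself; the column names, the suppression threshold and the sample records are illustrative assumptions. It shows how anonymised trip records might be turned into the kind of daily origin–destination counts between districts described above, with small cells suppressed before the data are shared:

```python
import pandas as pd

# Hypothetical anonymised trip records: one row per trip, no personal identifiers.
trips = pd.DataFrame({
    "date": ["2020-04-01", "2020-04-01", "2020-04-01", "2020-04-02"],
    "origin_district": ["28079", "28079", "08019", "28079"],
    "destination_district": ["08019", "08019", "28079", "46250"],
})

# Aggregate to daily counts of trips between pairs of districts.
flows = (
    trips.groupby(["date", "origin_district", "destination_district"])
         .size()
         .reset_index(name="trip_count")
)

# Suppress small cells before the table leaves the operator, so that no flow
# can be traced back to a handful of individuals (the threshold is illustrative).
MIN_COUNT = 2
flows = flows[flows["trip_count"] >= MIN_COUNT]

print(flows)
```

The point of the sketch is the design choice: aggregation and suppression happen before anything is shared, so policy makers see district-level flows rather than individual movements.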

Review into bias in algorithmic decision-making


Report by the Centre for Data Ethics and Innovation (CDEI) (UK): “Unfair biases, whether conscious or unconscious, can be a problem in many decision-making processes. This review considers the impact that an increasing use of algorithmic tools is having on bias in decision-making, the steps that are required to manage risks, and the opportunities that better use of data offers to enhance fairness. We have focused on the use of algorithms in significant decisions about individuals, looking across four sectors (recruitment, financial services, policing and local government), and making cross-cutting recommendations that aim to help build the right systems so that algorithms improve, rather than worsen, decision-making…(More)”.

Commission proposes measures to boost data sharing and support European data spaces


Press Release: “To better exploit the potential of ever-growing data in a trustworthy European framework, the Commission today proposes new rules on data governance. The Regulation will facilitate data sharing across the EU and between sectors to create wealth for society, increase control and trust of both citizens and companies regarding their data, and offer an alternative European model to data handling practice of major tech platforms.

The amount of data generated by public bodies, businesses and citizens is constantly growing. It is expected to multiply by five between 2018 and 2025. These new rules will allow this data to be harnessed and will pave the way for sectoral European data spaces to benefit society, citizens and companies. In the Commission’s data strategy of February this year, nine such data spaces have been proposed, ranging from industry to energy, and from health to the European Green Deal. They will, for example, contribute to the green transition by improving the management of energy consumption, make delivery of personalised medicine a reality, and facilitate access to public services.

The Regulation includes:

  • A number of measures to increase trust in data sharing, as the lack of trust is currently a major obstacle and results in high costs.
  • New EU rules on neutrality to allow novel data intermediaries to function as trustworthy organisers of data sharing.
  • Measures to facilitate the reuse of certain data held by the public sector. For example, the reuse of health data could advance research to find cures for rare or chronic diseases.
  • Means to give Europeans control over the use of the data they generate, by making it easier and safer for companies and individuals to voluntarily make their data available for the wider common good under clear conditions….(More)”.

Geospatial Data Market Study


Study by Frontier Economics: “Frontier Economics was commissioned by the Geospatial Commission to carry out a detailed economic study of the size, features and characteristics of the UK geospatial data market. The Geospatial Commission was established within the Cabinet Office in 2018, as an independent, expert committee responsible for setting the UK’s Geospatial Strategy and coordinating public sector geospatial activity. The Geospatial Commission’s aim is to unlock the significant economic, social and environmental opportunities offered by location data. The UK’s Geospatial Strategy (2020) sets out how the UK can unlock the full power of location data and take advantage of these opportunities….

Like many other forms of data, the value of geospatial data is not limited to the data creator or data user. Value from using geospatial data can be subdivided into several different categories, based on who the value accrues to:

Direct use value: where value accrues to users of geospatial data. This could include government using geospatial data to better manage public assets like roadways.

Indirect use value: where value is also derived by indirect beneficiaries who interact with direct users. This could include users of the public assets who benefit from better public service provision.

Spillover use value: value that accrues to others who are neither direct data users nor indirect beneficiaries. This could, for example, include lower levels of emissions due to improved management of the road network by government. The benefits of lower emissions are felt by all of society, even those who do not use the road network.

As the value from geospatial data does not always accrue to the direct user of the data, there is a risk of underinvestment in geospatial technology and services. Our £6 billion estimate of turnover for a subset of geospatial firms in 2018 does not take account of these wider economic benefits that “spill over” across the UK economy, and generate additional value. As such, the value that geospatial data delivers is likely to be significantly higher than we have estimated and is therefore an area for potential future investment….(More)”.

Interoperability as a tool for competition regulation


Paper by Ian Brown: “Interoperability is a technical mechanism for computing systems to work together – even if they are from competing firms. An interoperability requirement for large online platforms has been suggested by the European Commission as one ex ante (up-front rule) mechanism in its proposed Digital Markets Act (DMA), as a way to encourage competition. The policy goal is to increase choice and quality for users, and the ability of competitors to succeed with better services. The application would be to the largest online platforms, such as Facebook, Google, Amazon, smartphone operating systems (e.g. Android/iOS), and their ancillary services, such as payment and app stores.

This report analyses up-front interoperability requirements as a pro-competition policy tool for regulating large online platforms, exploring the economic and social rationales and possible regulatory mechanisms. It is based on a synthesis of recent comprehensive policy reviews of digital competition in major industrialised economies, and related academic literature, focusing on areas of emerging consensus while noting important disagreements. It draws particularly on the Vestager, Furman and Stigler reviews, and the UK Competition and Markets Authority’s study on online platforms and digital advertising. It also draws on interviews with software developers, platform operators, government officials, and civil society experts working in this field….(More)”.
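As a loose illustration of what interoperability means at the technical level, and not something taken from the paper, the sketch below shows two otherwise independent services exchanging messages through a shared, agreed format rather than through each other’s internal systems; the schema, class and field names are hypothetical:

```python
import json
from dataclasses import dataclass, asdict

# A hypothetical common message schema that both platforms agree to support.
@dataclass
class InteropMessage:
    sender: str
    recipient: str
    body: str
    sent_at: str  # ISO 8601 timestamp

def export_message(msg: InteropMessage) -> str:
    """Platform A serialises its message into the shared JSON format."""
    return json.dumps(asdict(msg))

def import_message(payload: str) -> InteropMessage:
    """Platform B parses the shared JSON format into its own internal object."""
    return InteropMessage(**json.loads(payload))

# Neither platform needs access to the other's internal code or data model;
# agreeing on the wire format is enough for their users to reach each other.
wire = export_message(InteropMessage("alice@platform-a", "bob@platform-b",
                                     "Hello across platforms", "2020-11-25T10:00:00Z"))
print(import_message(wire))
```

An up-front interoperability requirement of the kind discussed in the report would, roughly speaking, oblige the largest platforms to publish and maintain such shared formats and interfaces so that smaller competitors can plug into them.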

‘It gave me hope in democracy’: how French citizens are embracing people power


Peter Yeung at The Guardian: “Angela Brito was driving back to her home in the Parisian suburb of Seine-et-Marne one day in September 2019 when the phone rang. The 47-year-old caregiver, accustomed to emergency calls, pulled over in her old Renault Megane to answer. The voice on the other end of the line informed her she had been randomly selected to take part in a French citizens’ convention on climate. Would she, the caller asked, be interested?

“I thought it was a real prank,” says Brito, a single mother of four who was born in the south of Portugal. “I’d never heard anything about it before. But I said yes, without asking any details. I didn’t believe it.”

Brito received a letter confirming her participation but she still didn’t really take it seriously. On 4 October, the official launch day, she got up at 7am as usual and, while driving to meet her first patient of the day, heard a radio news item on how 150 ordinary citizens had been randomly chosen for this new climate convention. “I said to myself, ah, maybe it was true,” she recalls.

At the home of her second patient, a good-humoured old man in a wheelchair, the TV news was on. Images of the grand Art Déco-style Palais d’Iéna, home of the citizens’ gathering, filled the screen. “I looked at him and said, ‘I’m supposed to be one of those 150,’” says Brito. “He told me, ‘What are you doing here then? Leave, get out, go there!’”

Brito had two hours to get to the Palais d’Iéna. “I arrived a little late, but I arrived!” she says.

Over the next nine months, Brito would take part in the French citizens’ convention for the climate, touted by Emmanuel Macron as an “unprecedented democratic experiment”, which would bring together 150 people aged 16 upwards, from all over France and all walks of French life – to learn, debate and then propose measures to reduce greenhouse gas emissions by at least 40% by 2030. By the end of the process, Brito and her fellow participants had convinced Macron to pledge an additional €15bn (£13.4bn) to the climate cause and to accept all but three of the group’s 149 recommendations….(More)”.

Macron’s green democracy experiment gets political


Louise Guillot and Elisa Braun at Politico: “Emmanuel Macron asked 150 ordinary people to help figure out France’s green policies — and now this citizens’ convention is turning into a political problem for the French president.

The Citizens’ Convention on Climate was aimed at calming tensions in the wake of the Yellow Jackets protest movement — which was sparked by a climate tax on fuel — and showing that Macron wasn’t an out-of-touch elitist.

After nine months of deliberations, the convention came up with 149 proposals to slash greenhouse gas emissions this summer. The government has to put some of these measures before the parliament for them to become binding, and a bill is due to be presented in December.

But that’s too slow for many of the convention’s members, who feel the government is back-pedalling on some of the ideas and that Macron has poked fun at them.

Muriel Raulic, a member of the convention, accused Macron of using the body to greenwash his administration.

She supports a moratorium on 5G high-speed mobile technology, which has created some health and environmental fears. Macron has dismissed proponents of the ban as “Amish” — a Christian sect suspicious of technology.

The 150 members wrote an open letter to Macron in mid-October, complaining about a lack of “clear and defined support from the executive, whose positions sometimes appear contradictory,” and pointing to “openly hostile communications” from “certain professional actors.”

Some gathered late last month before the National Assembly to complain they felt used and treated like “guinea pigs” by politicians. In June, they created an association to oversee what the government is doing with their proposals. 

…The government denied it is using the convention to greenwash itself….(More)”.

A Legal Framework for Access to Data – A Competition Policy Perspective


Paper by Heike Schweitzer and Robert Welker: “The paper strives to systematise the debate on access to data from a competition policy angle. At the outset, two general policy approaches to access to data are distinguished: a “private control of data” approach versus an “open access” approach. We argue that, when it comes to private sector data, the “private control of data” approach is preferable. According to this approach, the “whether” and “how” of data access should generally be left to the market. However, public intervention can be justified by significant market failures. We discuss the presence of such market failures and the policy responses, including, in particular, competition policy responses, with regard to three different data access scenarios: access to data by co-generators of usage data (Scenario 1); requests for access to bundled or aggregated usage data by third parties vis-à-vis a service or product provider who controls such datasets, with the goal of entering complementary markets (Scenario 2); requests by firms to access the large usage data troves of the Big Tech online platforms for innovative purposes (Scenario 3). On this basis we develop recommendations for data access policies….(More)”.

Policy making in a digital world


Report by Lewis Lloyd: “…Policy makers across government lack the necessary skills and understanding to take advantage of digital technologies when tackling problems such as coronavirus and climate change. The report says that already-poor data management has been exacerbated by a lack of leadership, with the role of government chief data officer unfilled since 2017. These failings have been laid bare by the stuttering coronavirus Test and Trace programme.

Drawing on interviews with policy experts and digital specialists inside and outside government, the report argues that better use of data and new technologies, such as artificial intelligence, would improve policy makers’ understanding of problems like coronavirus and climate change, and aid collaboration with colleagues, external organisations and the public in seeking solutions to them. It urges government to trial innovative applications of data and technology in a wider range of policies, but warns that recent failures such as the A-level algorithm fiasco mean it must also do more to secure public trust in its use of such technologies. This means strengthening oversight and initiating a wider public debate about the appropriate use of digital technologies, as well as improving officials’ understanding of the limitations of data-driven analysis. The report recommends that the government:

  1. Appoints a chief data officer as soon as possible to drive work on improving data quality, tackle problems with legacy IT and make sure new data standards are applied and enforced across government.
  2. Places more emphasis on statistical and technological literacy when recruiting and training policy officials.
  3. Sets up a new independent body to lead on public engagement in policy making, with an initial focus on how and when government should use data and technology…(More)”.

Deliberation against Participation? Yellow Vests and Grand Débat: A Perspective from Deliberative Theory


Paper by Tamara Ehs and Monika Mokre: “The yellow vest movement started in November 2018 and became the longest protest movement in France since 1945. The movement provoked different reactions from the French government—on the one hand, violence and repression; on the other, concessions. One of them was to provide a possibility for citizens’ participation by organizing the so-called “Grand Débat.” It was clear to all observers that this was less an attempt to further democracy in France than to calm down the protests of the yellow vests. Thus, it seemed doubtful from the beginning whether this form of participatory democracy could be understood as a real form of citizens’ deliberation, and in fact several shortcomings with regard to procedure and participation were pointed out by theorists of deliberative democracy. The aim of this article is to analyze the Grand Débat with regard to its deliberative qualities and shortcomings….(More)”.