“European Digital Sovereignty”: Successfully Navigating Between the “Brussels Effect” and Europe’s Quest for Strategic Autonomy


Paper by Theodore Christakis: “This study discusses the concept of ‘European Digital Sovereignty’ in depth. It presents the opportunities opened by the concept but also the risks and pitfalls, and provides a panorama of the major debates and developments related to digital/cyber issues in Europe. Here is the Executive Summary of the Study:

“The Times They Are A-Changin’”. When Jean-Claude Juncker, then President of the European Commission, proclaimed in 2018 that “The Hour of European Sovereignty” had come, half of Europe criticized him, recalls Paul Timmers. Today, hardly a day goes by in Europe without a politician talking about “digital sovereignty”.

From a purely normative point of view, the concept makes little sense. It can only further accentuate the classic confusion surrounding the use of the term “sovereignty”, which is one of the most equivocal terms in legal theory and which has been criticized by a famous scholar for often being nothing more than “a catchword, a substitute for thinking and precision”. Still, from a political point of view, “European digital sovereignty” is an extremely powerful concept, broad and ambiguous enough to encompass very different things and to become a “projection surface for a wide variety of political demands”.

This study analyses in detail the two understandings of the term: sovereignty as regulatory power; and sovereignty as strategic autonomy, the ability to act in the digital sphere without being restricted to an undesired extent by external dependencies. While doing so, this study presents a panorama of the most recent legislative proposals (such as the Data Governance Act) and the most important debates on digital issues currently taking place at the European level: 5G deployment and cybersecurity concerns; international data transfers and foreign governments’ access to data after Schrems II; cloud computing; the digital services tax; competition law; content moderation; artificial intelligence regulation; and so many others….(More)”.

Connected Devices – an Unfair Competition Law Approach to Data Access Rights of Users


Paper by Josef Drexl: “On the European level, promoting the free flow of data and access to data has moved to the forefront of the policy goals concerning the digital economy. A particular aspect of this economy is the advent of connected devices that are increasingly deployed and used in the context of the Internet of Things (IoT). As regards these devices, the Commission has identified the particular problem that the manufacturers may try to remain in control of the data and refuse data access to third parties, thereby impeding the development of innovative business models in secondary data-related markets. To address this issue, this paper discusses potential legislation on data access rights of the users of connected devices. The paper conceives refusals of the device manufacturers to grant access to data vis-à-vis users as a form of unfair trading practice and therefore recommends embedding data access rights of users in the context of the European law against unfair competition. Such access rights would be complementary to other access regimes, including sector-specific data access rights of competitors in secondary markets as well as access rights available under contract and competition law. Against the backdrop of ongoing debates to reform contract and competition law for the purpose of enhancing data access, the paper seeks to draw attention to a so far unexplored unfair competition law approach….(More)”.

Mapping urban temperature using crowd-sensing data and machine learning


Paper by Marius Zumwald, Benedikt Knüsel, David N. Bresch and Reto Knutti: “Understanding the patterns of urban temperature at a high spatial and temporal resolution is of great importance for urban heat adaptation and mitigation. Machine learning offers promising tools for high-resolution modeling of urban heat, but it requires large amounts of data. Measurements from official weather stations are too sparse but could be complemented by crowd-sensed measurements from citizen weather stations (CWS). Here we present an approach to model urban temperature using the quantile regression forest algorithm and CWS, open government and remote sensing data. The analysis is based on data from 691 sensors in the city of Zurich (Switzerland) during a heat wave from 25–30 June 2019. We trained the model using hourly data from 25–29 June (n = 71,837) and evaluated the model using data from 30 June (n = 14,105). Based on the model, spatiotemporal temperature maps of 10 × 10 m resolution were produced. We demonstrate that our approach can accurately map urban heat at high spatial and temporal resolution without additional measurement infrastructure. We furthermore critically discuss and spatially map estimated prediction and extrapolation uncertainty. Our approach is able to inform highly localized urban policy and decision-making….(More)”.
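The core idea behind a quantile regression forest — reading quantiles off the spread of an ensemble's predictions rather than averaging them into a single point estimate — can be sketched in a few lines. The sketch below is a heavily simplified illustration with synthetic data, not the authors' model or the Zurich measurements: each bootstrap 1-nearest-neighbour predictor stands in for a tree, and the function names are ours.

```python
# Minimal sketch of ensemble-based quantile prediction, in the spirit of a
# quantile regression forest. Each "tree" is a 1-nearest-neighbour predictor
# fitted on a bootstrap resample; all data are synthetic illustrations.
import random

random.seed(42)

# Synthetic training data: (feature, temperature)-like pairs with noise.
train = [(x / 10.0, 20.0 + 5.0 * (x / 10.0) + random.gauss(0, 1.0))
         for x in range(100)]

def nn_predict(sample, x_new):
    """Predict with the y of the nearest training x (a stand-in for one tree)."""
    return min(sample, key=lambda p: abs(p[0] - x_new))[1]

def ensemble_quantiles(train, x_new, n_members=200, qs=(0.05, 0.5, 0.95)):
    """Collect predictions across bootstrap members and read off quantiles."""
    preds = []
    for _ in range(n_members):
        boot = [random.choice(train) for _ in train]  # bootstrap resample
        preds.append(nn_predict(boot, x_new))
    preds.sort()
    return {q: preds[min(int(q * len(preds)), len(preds) - 1)] for q in qs}

q = ensemble_quantiles(train, x_new=5.0)
# The 5th-95th percentile spread is a per-location uncertainty estimate,
# analogous to the prediction uncertainty the paper maps alongside temperature.
print(q[0.5], q[0.95] - q[0.05])
```

The same mechanism is what lets the paper map not only a temperature surface but also where that surface is trustworthy: wherever the ensemble disagrees, the quantile spread widens.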

Feasibility study on the possible introduction of a mechanism for certifying artificial intelligence tools & services


Press Release by the Council of Europe: “The European Commission for the Efficiency of Justice (CEPEJ) has adopted a feasibility study on the possible establishment of a certification mechanism for artificial intelligence tools and services. The study is based on the CEPEJ Charter on the use of artificial intelligence in judicial systems and their environment, adopted in December 2018. The Council of Europe, if it decides to create such a mechanism, could be a pioneer in this field. After consultation with all member and observer states, this feasibility study will be followed by an action plan that the CEPEJ will prepare and send to the Committee of Ministers for examination in 2021….(Study)”.

People understand statistics better than politicians think


Sarah O’Connor at the Financial Times: “In 2015 I took my reporter’s notebook to Liverpool because statistics suggested it was enjoying a jobs boom. The unemployment gap between the northern English city and the national average had shrunk to the smallest in a decade. When I mentioned that fact to people I met, I might as well have said the grass was pink.

“It’s certainly not our experience, I would say I’ve never seen poverty at this level,” was the response from the director of the local Citizens Advice Bureau. A woman who ran a small cake business said: “My cynical side thinks straight away they’ve probably got zero-hours contracts somewhere — [they] are a great way of cooking the books.”

I thought of that trip when I read a newly published study that uses an in-depth survey and focus groups to delve into the British public’s understanding of economics. The headline findings are bleak. Large parts of the public have misperceptions about how economic concepts such as the unemployment rate are measured and they are “sceptical and cynical” about data.

One obvious response would be to blame inadequate education and worry that economic ignorance allows people to be duped by demagogues such as Nigel Farage in the UK and Donald Trump in the US.

Economic literacy classes in schools would certainly be a good idea, especially since most of those surveyed were “deeply interested” in the economy and regretted not understanding the details. But there’s more to this story. The public live and breathe the economy every day. If their first response to a statistic such as the unemployment rate is to say “that doesn’t feel right” (a common response in the focus groups) then perhaps it’s the economists who are missing something….(More)”.

John Snow, Cholera, and South London Reconsidered


Paper by Thomas Coleman: “John Snow, the London doctor often considered the father of modern epidemiology, analyzed 1849 and 1854 cholera mortality for a population of nearly half a million in South London. His aim was to convince skeptics and “prove the overwhelming influence which the nature of the water supply exerted over the mortality.” Snow’s analysis was innovative – he is commonly credited with the first application of both randomization as an instrumental variable and differences-in-differences (DiD).

This paper provides an historical review of Snow’s three approaches to analyzing the data: a direct comparison of mixed (quasi-randomized) populations; a primitive form of difference-in-differences; and a comparison of actual versus predicted mortality. Snow’s analysis did not convince his skeptics, and we highlight problems with his analysis. We then turn to a re-analysis of Snow’s evidence, casting his analysis in the modern forms of randomization as IV and DiD. The re-examination supports Snow’s claims that data demonstrated the influence of water supply. As a matter of historical interest this strengthens the argument against Snow’s skeptics. For modern practitioners, the data and analysis provide an example of modern statistical tools (randomization and DiD) and the complementary use of observational and (quasi) experimental data….(More)”
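The difference-in-differences logic attributed to Snow — comparing the change in mortality where the water supply improved against the change where it did not — reduces to a single subtraction. The mortality figures below are illustrative placeholders, not Snow's actual data.

```python
# Hedged sketch of the DiD comparison described in the paper.
# Numbers are invented for illustration (deaths per 10,000 households).
pre = {"lambeth": 130.0, "southwark": 125.0}   # 1849: both suppliers drew polluted water
post = {"lambeth": 37.0, "southwark": 147.0}   # 1854: Lambeth had moved its intake upstream

# Lambeth is "treated" (cleaner supply); Southwark & Vauxhall is the control.
did = (post["lambeth"] - pre["lambeth"]) - (post["southwark"] - pre["southwark"])
print(did)  # -115.0: mortality fell where, and only where, the supply improved
```

The subtraction of the control group's change nets out citywide factors (weather, the epidemic's overall severity) that affected both populations, which is exactly what makes the comparison more persuasive than the raw 1854 rates alone.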

Berlin Declaration on Digital Society and Value-based Digital Government


European Commission: “…The Declaration follows up on the success of the Tallinn Declaration on eGovernment, which endorsed the key principles for digital public services put forward in the eGovernment Action Plan 2016-2020. The Berlin Declaration takes the user-centricity principles formulated in the Tallinn Declaration a step further by strengthening the pioneering role of public administrations in driving a value-based digital transformation of our European societies.

The Declaration acknowledges the public sector as an essential element for the European Single Market and a driving force for new and innovative technological solutions for public services and societal challenges. It emphasises that public authorities at all levels must lead by example to strengthen the tenets of the European Union.

To do so, it sets out seven key principles with related policy action lines at national and EU level:

  1. Validity and respect of fundamental rights and democratic values in the digital sphere;
  2. Social participation and digital inclusion to shape the digital world;
  3. Empowerment and digital literacy, allowing all citizens to participate in the digital sphere;
  4. Trust and security in digital government interactions, allowing everyone to navigate the digital world safely, authenticate and be digitally recognised within the EU conveniently;
  5. Digital sovereignty and interoperability, as key to ensuring the ability of citizens and public administrations to make decisions and act in a self-determined way in the digital world;
  6. Human-centred systems and innovative technologies in the public sector, strengthening its pioneering role in the research on secure and trustworthy technology design;
  7. A resilient and sustainable digital society, preserving our natural foundations of life in line with the Green Deal and using digital technologies to enhance the sustainability of our health systems….(More)”.

Why Predictive Algorithms are So Risky for Public Sector Bodies


Paper by Madeleine Waller and Paul Waller: “This paper collates multidisciplinary perspectives on the use of predictive analytics in government services. It moves away from the hyped narratives of “AI” or “digital”, and the broad usage of the notion of “ethics”, to focus on highlighting the possible risks of the use of prediction algorithms in public administration. Guidelines for AI use in public bodies are currently available; however, there is little evidence that these are being followed or that they are being written into new mandatory regulations. The use of algorithms is not just an issue of whether they are fair and safe to use, but whether they abide by the law and whether they actually work.

Particularly in public services, there are many things to consider before implementing predictive analytics algorithms, as flawed use in this context can lead to harmful consequences for citizens, individually and collectively, and public sector workers. All stages of the implementation process of algorithms are discussed, from the specification of the problem and model design through to the context of their use and the outcomes.

Evidence is drawn from case studies of use in child welfare services, the US Justice System and UK public examination grading in 2020. The paper argues that the risks and drawbacks of such technological approaches need to be more comprehensively understood, and testing done in the operational setting, before implementing them. The paper concludes that while algorithms may be useful in some contexts and help to solve problems, those that attempt to predict real life appear to have a long way to go before they are safe and trusted for use. As “ethics” are located in time, place and social norms, the authors suggest that in the context of public administration, laws on human rights, statutory administrative functions, and data protection — all within the principles of the rule of law — provide the basis for appraising the use of algorithms, with maladministration being the primary concern rather than a breach of “ethics”….(More)”

Data Readiness: Lessons from an Emergency


The DELVE Initiative:  “Responding to the COVID-19 pandemic has required rapid decision-making in changing circumstances. Those decisions and their effects on the health and wealth of the nation can be better informed with data. Today, technologies that can acquire data are pervasive. Data is continually produced by devices like mobile phones, payment points and road traffic sensors. This creates opportunities for nowcasting of important metrics such as GDP, population movements and disease prevalence, which can be used to design policy interventions that are targeted to the needs of specific sectors or localities. The data collected as a by-product of daily activities is different to epidemiological or other population research data that might be used to drive the decisions of state. These new forms of data are happenstance, in that they are not originally collected with a particular research or policy question in mind but are created through the normal course of events in our digital lives, and our interactions with digital systems and services.

This happenstance data pertains to individual citizens and their daily activities. To be useful it needs to be anonymized, aggregated and statistically calibrated to provide meaningful metrics for robust decision making while managing concerns about individual privacy or business value. This process necessitates particular technical and domain expertise that is often found in academia, but it must be conducted in partnership with the industries, and public sector organisations, that collect or generate the data and government authorities that take action based on those insights. Such collaborations require governance mechanisms that can respond rapidly to emerging areas of need, a common language between partners about how data is used and how it is being protected, and careful stewardship to ensure appropriate balancing of data subjects’ rights and the benefit of using this data. This is the landscape of data readiness; the availability and quality of the UK nation’s data dictates our ability to respond in an agile manner to evolving events….(More)”.
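One of the simplest safeguards in the anonymize-and-aggregate step described above is small-cell suppression: individual records are rolled up into area-level counts, and any count below a threshold is withheld. The toy sketch below uses invented data and is only one (by itself insufficient) building block of the calibration pipeline the report describes; the threshold `K` and area names are our assumptions.

```python
# Toy sketch: aggregate individual-level records into area counts and
# suppress small cells before publication. Data are invented.
from collections import Counter

K = 5  # minimum cell size below which a count is suppressed

# One entry per person-level record, e.g. derived from happenstance data.
records = ["areaA"] * 12 + ["areaB"] * 3 + ["areaC"] * 8

counts = Counter(records)
# Publish a count only when the cell is large enough to protect individuals.
published = {area: (n if n >= K else None) for area, n in counts.items()}
print(published)  # areaB's count is withheld as a small cell
```

In practice this sits alongside stronger techniques (noise addition, differential privacy, statistical calibration against representative surveys), which is why the report stresses partnership between data holders and the technical expertise found in academia.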

Impact through Engagement: Co-production of administrative data research


Paper by Elizabeth Nelson and Frances Burns: “The Administrative Data Research Centre Northern Ireland (ADRC NI) is a research partnership between Queen’s University Belfast and Ulster University to facilitate access to linked administrative data for research purposes for public benefit and for evidence-based policy development. This requires a social licence extended by publics which is maintained by a robust approach to engagement and involvement.

Public engagement is central to the ADRC NI’s approach to research. Research impact is pursued and secured through robust engagement and co-production of research with publics and key stakeholders. This is done by focusing on data subjects (the cohort of people whose lives make up the datasets), placing value on experts by experience outside of academic knowledge, and working with public(s) as key data advocates, through project steering committees and targeted events with stakeholders. The work is led by a dedicated Public Engagement, Communications and Impact Manager.

While there are strengths and weaknesses to the ADRC NI approach, examples of successful partnerships and clear pathways to impact demonstrate its utility and ability to amplify the positive impact of administrative data research. Working with publics will become more critical as data use becomes more ubiquitous in a post-COVID-19 world. ADRC NI’s model is a potential way forward….(More)”.

See also Special Issue on Public Involvement and Engagement by the International Journal of Population Data Science.