Yellow Vests and the Grand Débat: Deliberative Qualities and Shortcomings


Paper by Tamara Ehs and Monika Mokre: “The yellow vest movement started in November 2018 and has formed the longest protest movement in France since 1945. The movement provoked different reactions from the French government—on the one hand, violence and repression; on the other hand, concessions. One of them was to provide a possibility for citizens’ participation by organizing the so-called “Grand Débat.” It was clear to all observers that this was less an attempt to further democracy in France than to calm down the protests of the yellow vests. Thus, it seemed doubtful from the beginning whether this form of participatory democracy could be understood as a real form of citizens’ deliberation, and in fact, several shortcomings with regard to procedure and participation were pointed out by theorists of deliberative democracy. The aim of this article is to analyze the Grand Débat with regard to its deliberative qualities and shortcomings….(More)”.

Blame the politicians, not the technology, for A-level fiasco


The Editorial Board at the Financial Times: “The soundtrack of school students marching through Britain’s streets shouting “f*** the algorithm” captured the sense of outrage surrounding the botched awarding of A-level exam grades this year. But the students’ anger towards a disembodied computer algorithm is misplaced. This was a human failure. The algorithm used to “moderate” teacher-assessed grades had no agency and delivered exactly what it was designed to do.

It is politicians and educational officials who are responsible for the government’s latest fiasco and should be the target of students’ criticism….

Sensibly designed, computer algorithms could have been used to moderate teacher assessments in a constructive way. Using past school performance data, they could have highlighted anomalies in the distribution of predicted grades between and within schools. That could have led to a dialogue between Ofqual, the exam regulator, and anomalous schools to come up with more realistic assessments….
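To make that suggestion concrete, here is a minimal sketch of the kind of moderation check described above: it flags schools whose teacher-predicted averages diverge sharply from their own recent results, as a prompt for dialogue rather than an automatic downgrade. The school names, grade figures and threshold are invented for illustration and are not Ofqual's actual model.

```python
# Hypothetical sketch: flag schools whose teacher-predicted grades diverge
# sharply from their own recent history, as a prompt for human review rather
# than an automatic downgrade. Schools, figures and threshold are invented.

from statistics import mean

# Average A-level points per school: three prior years vs. 2020 teacher predictions
historical = {"School A": [28.1, 27.5, 28.4], "School B": [31.0, 30.2, 30.8]}
predicted = {"School A": 29.0, "School B": 36.5}

def flag_anomalies(historical, predicted, threshold=3.0):
    """Return schools whose predicted average differs from their historical
    average by more than `threshold` points, for follow-up dialogue."""
    flagged = {}
    for school, past in historical.items():
        baseline = mean(past)
        gap = predicted[school] - baseline
        if abs(gap) > threshold:
            flagged[school] = round(gap, 1)
    return flagged

print(flag_anomalies(historical, predicted))
# {'School B': 5.8} -> a conversation with the school, not a forced downgrade
```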

There are broader lessons to be drawn from the government’s algo fiasco about the dangers of automated decision-making systems. The inappropriate use of such systems to assess immigration status, policing policies and prison sentencing decisions is a live danger. In the private sector, incomplete and partial data sets can also significantly disadvantage under-represented groups when it comes to hiring decisions and performance measures.

Given the severe erosion of public trust in the government’s use of technology, it might now be advisable to subject all automated decision-making systems to critical scrutiny by independent experts. The Royal Statistical Society and The Alan Turing Institute certainly have the expertise to give a Kitemark of approval or flag concerns.

As ever, technology in itself is neither good nor bad. But it is certainly not neutral. The more we deploy automated decision-making systems, the smarter we must become in considering how best to use them and in scrutinising their outcomes. We often talk about a deficit of trust in our societies. But we should also be aware of the dangers of over-trusting technology. That may be a good essay subject for next year’s philosophy A-level….(More)”.

Responsible innovation requires new workways, and courage


Article by Jon Simonsson, Chair of the Committee for Technological Innovation and Ethics (Komet) in Sweden: “People have said that in the present – the fourth industrial revolution – everything is possible. The ingredients are there – 5G, IoT, AI, drones and self-driving vehicles – as well as advanced knowledge about diagnosis and medication – and they are all rapidly evolving. Only the innovator sets the limitations for how to mix and bake with these technologies.

And right now, when the threat of the coronavirus has almost shock-digitized both business and the public sector, the interest in new technology solutions has skyrocketed. Working remotely, moving things without human presence, or – most importantly – virus vaccines and medical treatment methods, have all become self-evident areas for intensified research and experimentation. But the laws and regulations surrounding these areas were often created for a completely different setting.

Rules are good, and there are usually very good reasons why an area is regulated. Some rules are intended to safeguard democratic rights or individual rights to privacy, others to steer developments in a certain direction. Rules are needed, especially now, when not only the development of technology but also its uptake in society is accelerating. It takes time to develop laws and regulations, and the process of doing so cannot keep pace with the rapid development of technology. This creates risks in society, for example risks related to the individual’s right to privacy, the economy or the environment. At the same time, gaps in regulation may be revealed, gaps that could lead to the introduction of new and perhaps undesirable solutions.

Would it be possible to find a middle ground and a more future-oriented way to work with regulation? With rules that are clear, future-proof and developed with legally sound methods, but that encourage and facilitate ethical and sustainable innovation?

Responsible development and use of new technology

The Government wants Sweden to be a leader in the responsible development and use of new technologies. The Swedish Committee for Technological Innovation and Ethics (Komet) works with policy development to create good conditions for innovation and competitiveness, while ensuring that development and dissemination of new technology is safe and secure. The Committee helps the Swedish government to proactively address improvements technology could create for citizens, business and society, but also to highlight the conflicting goals that may arise.

This includes raising ethical issues related to the rapid technological development. When almost everything is possible, we need to place particularly high demands on our compass: on how we responsibly navigate the technology landscape. Not least during the coronavirus pandemic, when we have seen ethical boundaries being shifted in the use of surveillance technology.

An important objective of the Komet work is to instil courage in the public sector. Although innovators often come from the private sector, at the end of the day it is the public sector that must enable, be willing to and dare to meet the demands of both business and society. It is the public sector’s role to ensure that the proper regulations are on the table: the balanced and future-oriented regulation that will be required to rapidly create a sustainable world….(More)”.

Digital Technology and the Resurrection of Trust


Report by the Select Committee on Democracy and Digital Technologies (UK Parliament): “Democracy faces a daunting new challenge. The age where electoral activity was conducted through traditional print media, canvassing and door knocking, is rapidly vanishing. Instead it is dominated by digital and social media. They are now the source from which voters get most of their information and political messaging.

The digital and social media landscape is dominated by two behemoths–Facebook and Google. They largely pass under the radar, operating outside the rules that govern electoral politics. This has become acutely obvious in the current COVID-19 pandemic where online misinformation poses not only a real and present danger to our democracy but also to our lives. Governments have been dilatory in adjusting regulatory regimes to capture these new realities. The result is a crisis of trust.

Yet our profound belief is that this can change. Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society. If this is done well, in the ways we spell out in this Report, technology can become a servant of democracy rather than its enemy. There is a need for Government leadership and regulatory capacity to match the scale and pace of challenges and opportunities that the online world presents.

The Government’s Online Harms programme presents a significant first step towards this goal. It needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.

Well drafted Online Harms legislation can do much to protect our democracy. Issues such as misinformation and disinformation must be included in the Bill. The Government must make sure that online platforms bear ultimate responsibility for the content that their algorithms promote. Where harmful content spreads virally on their service or where it is posted by users with a large audience, they should face sanctions over their output as other broadcasters do.

Individual users need greater protection. They must have redress against large platforms through an ombudsman tasked with safeguarding the rights of citizens.

Transparency of online platforms is essential if democracy is to flourish. Platforms like Facebook and Google seek to hide behind ‘black box’ algorithms which choose what content users are shown. They take the position that they are not responsible for harms that may result from online activity. This is plain wrong. The decisions platforms make in designing and training these algorithmic systems shape the conversations that happen online. For this reason, we recommend that platforms be mandated to conduct audits to show how, in creating these algorithms, they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire the information from platforms they need to exercise those powers….(More)”.
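The audit recommendation can be made concrete. The sketch below is a hypothetical example of one such check, a disparate-impact comparison of how often content from (or to) different groups is promoted by an algorithm; the log format, group labels and the 0.8 threshold are assumptions for illustration, not anything specified in the report.

```python
# Hypothetical audit sketch: check whether an algorithm promotes content from
# (or to) different demographic groups at markedly different rates.
# Group labels, counts and the 0.8 "disparate impact" threshold are illustrative.

from collections import Counter

def exposure_rates(impressions):
    """impressions: list of (group, was_promoted) tuples logged by the platform."""
    shown = Counter()
    promoted = Counter()
    for group, was_promoted in impressions:
        shown[group] += 1
        promoted[group] += int(was_promoted)
    return {g: promoted[g] / shown[g] for g in shown}

def audit(impressions, threshold=0.8):
    rates = exposure_rates(impressions)
    best = max(rates.values())
    # Flag any group promoted at less than `threshold` of the best-served group's rate
    return {g: round(r / best, 2) for g, r in rates.items() if r / best < threshold}

log = [("group_a", True)] * 40 + [("group_a", False)] * 60 \
    + [("group_b", True)] * 20 + [("group_b", False)] * 80
print(audit(log))  # {'group_b': 0.5} -> group_b is promoted at half the rate of group_a
```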

Permanent joint committees in Belgium: involving citizens in parliamentary debate


Article by Elisa Minsart and Vincent Jacquet: “Amidst wide public disillusionment with the institutions of representative democracy, political scientists, campaigners and politicians have intensified efforts to find an effective mechanism to narrow the gap between citizens and those who govern them. One of the most popular remedies in recent years – and one frequently touted as a way to break the Brexit impasse encountered by the UK political class in 2016-19 – is that of citizens’ assemblies. These deliberative forums gather diversified samples of the population, recruited through a process of random selection. Citizens who participate meet experts, deliberate on a specific public issue and make a range of recommendations for policy-making. Citizens’ assemblies are flourishing in many representative democracies – not least in the UK, with the current Climate Assembly UK and Citizens’ Assembly of Scotland. They show that citizens are able to deliberate on complex political issues and to deliver original proposals. 

For several years now, some public leaders, scholars and politicians have sought to integrate these democratic innovations into more traditional political structures. Belgium recently made a step in this direction. Each of Belgium’s three regions has its own parliament, with full legislative powers: on 13 November 2019, a proposition was approved to modify how the Parliament of the Brussels Region operates. The reform mandates the establishment of joint deliberative committees, on which members of the public will serve alongside elected representatives. This will enable ordinary people to deliberate with MPs on preselected themes and to formulate recommendations. The details of the process are currently still being drafted and the first committee is expected to launch at the end of 2020. Despite the COVID-19 crisis, drafting and negotiations with other parties have not been interrupted thanks to an online platform and a videoconference facility.

This experience has been inspired by other initiatives organised in Belgium. In 2011, the G1000 initiative brought together more than 700 randomly selected citizens to debate on different topics. This grassroots experiment attracted lots of public attention. In its aftermath, the different parliaments of the country launched their own citizens’ assemblies, designed to tackle specific local issues. Some international experiences also inspired the Brussels Region, in particular the first Irish Constitutional Convention (2012–2014). This assembly was composed of both elected representatives and randomly selected citizens, and led directly to a referendum that approved the legalisation of same-sex marriage. However, the present joint committees go well beyond these initiatives. Whereas both of these predecessors were ad hoc initiatives designed to resolve particular problems, the Brussels committees will be permanent and hosted at the heart of the parliament. Both of these aspects make the new committees a major innovation and entirely different from the predecessors that helped inspire them…(More)”.

German humanities scholars enlisted to end coronavirus lockdown


David Matthews at THE: “In contrast to other countries, philosophers, historians, theologians and jurists have played a major role advising the state as it seeks to loosen restrictions…

In the struggle against the new coronavirus, humanities academics have entered the fray – in Germany at least.

Arguably to a greater extent than has happened in the UK, France or the US, the country has enlisted the advice of philosophers, historians of science, theologians and jurists as it navigates the delicate ethical balancing act of reopening society while safeguarding the health of the public.

When the German federal government announced a slight loosening of restrictions on 15 April – allowing small shops to open and some children to return to school in May – it had been eagerly awaiting a report written by a 26-strong expert group containing only a minority of natural scientists and barely a handful of virologists and medical specialists.

Instead, this working group from the Leopoldina – Germany’s independent National Academy of Sciences dating back to 1652 – included historians of industrialisation and early Christianity, a specialist on the philosophy of law and several pedagogical experts.

This paucity of virologists earned the group a swipe from Markus Söder, minister-president of badly hit Bavaria, who has led calls in Germany for a tough lockdown (although earlier in the pandemic the Leopoldina did release a report written by more medically focused specialists).

But “the crisis is a complex one, it’s a systemic crisis” and so it needs to be dissected from every angle, argued Jürgen Renn, director of the Max Planck Institute for the History of Science, and one of those who wrote the crucial recommendations.

And Professor Renn – who earlier this year published a book on rethinking science in the Anthropocene – made the argument for green post-virus reconstruction. Urbanisation and deforestation have squashed mankind and wildlife together, making other animal-to-human disease transmissions ever more likely, he argued. “It’s not the only virus waiting out there,” he said.

Germany’s Ethics Council – which traces its roots back to the stem cell debates of the early 2000s and is composed of theologians, jurists, philosophers and other ethical thinkers – also contributed to a report at the end of March, warning that it was up to elected politicians, not scientists, to make the “painful decisions” weighing up the lockdown’s effect on health and its other side-effects….(More)“.

France asks its citizens how to meet its climate-change targets


The Economist on “An experiment in consultative democracy”: “A nurse, a roofer, an electrician, a former fireman, a lycée pupil, a photographer, a teacher, a marketing manager, an entrepreneur and a civil servant. Sitting on red velvet benches in a domed art-deco amphitheatre in Paris, they and 140 colleagues are part of an unusual democratic experiment in a famously centralised country. Their mission: to draw up measures to reduce French greenhouse-gas emissions by at least 40% by 2030, in line with an EU target that is otherwise in danger of being missed (and which the European Commission now wants to tighten). Six months ago, none of them had met. Now, they have just one month left to show that they can reinvent the French democratic process—and help save the planet. “It’s our moment,” Sylvain, one of the delegates, tells his colleagues from the podium. “We have the chance to propose something historic.”

On March 6th the “citizens’ climate convention” was due to begin its penultimate three-day sitting, the sixth since it began work last October. The convention is made up of a representative sample of the French population, selected by randomly generated telephone numbers. President Emmanuel Macron devised it in an attempt to calm the country after the gilets jaunes (yellow jackets) crisis of 2018. In response to the demand for less top-down decision-making, he first launched what he grandly called a “great national debate”, which took place a year ago. He also pledged the creation of a citizens’ assembly. It is designed to focus on precisely the conundrum that provoked the original protests against a rise in the carbon tax on motor fuel: how to make green policy palatable, efficient and fair….(More)”.

Belgium’s experiment in permanent forms of deliberative democracy


Article by Min Reuchamps: “In December 2019, the parliament of the Region of Brussels in Belgium amended its internal regulations to allow the formation of ‘deliberative committees’ composed of a mixture of members of the Regional Parliament and randomly selected citizens. This initiative follows innovative experiences in the German-speaking Community of Belgium, known as Ostbelgien, and the city of Madrid in establishing permanent forums of deliberative democracy earlier in 2019. Ostbelgien is now experiencing its first cycle of deliberations, whereas the Madrid forum proved short-lived, having been cancelled after two meetings by the new governing coalition of the city.

The experimentation with establishing permanent forums for direct citizen involvement constitutes an advance on earlier deliberative processes, which were one-off, non-permanent exercises. The relatively large size of the Brussels Region, with over 1,200,000 inhabitants, means that the lessons will be key in understanding the opportunities and risks of ‘deliberative committees’ and their potential scalability….

Under the new rules, the Regional Parliament can set up a parliamentary committee composed of 15 (12 in the Cocof) parliamentarians and 45 (36 in the Cocof) citizens to draft recommendations on a given issue. Any inhabitant of Brussels who has attained 16 years of age has the chance to have a direct say in matters falling under the jurisdiction of the Brussels Regional Parliament and the Cocof. The citizen representatives will be drawn by lot in two steps (a simplified sketch of this two-stage draw follows the list below):

  • A first draw among the whole population, so that every inhabitant has the same chance to be invited via a formal invitation letter from the Parliament;
  • A second draw among all the persons who have responded positively to the invitation by means of a sampling method following criteria to ensure a diverse and representative selection, at least in terms of gender, age, official languages of the Brussels-Capital Region, geographical distribution and level of education.
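
For illustration, here is a minimal sketch of the two-stage draw described above. The 45 citizen seats and the idea of stratifying on gender, age and language come from the article, and the population figure from the discussion of the Brussels Region; the invitation numbers, response rate, profile data and equal-quota logic are simplifying assumptions rather than the parliament's actual procedure.

```python
# Illustrative two-stage sortition: (1) a random draw of invitations from the
# population register, (2) a stratified draw among positive respondents.
# The register data, response rate and quota scheme are invented for demonstration.

import random
from itertools import product

random.seed(0)

# Stage 1: draw invitation letters from the full population register
population = [f"resident_{i}" for i in range(1_200_000)]
invited = random.sample(population, 10_000)

# Stand-in for register data attached to each respondent (deterministic per person)
def profile(person_id):
    rng = random.Random(person_id)
    return (rng.choice(["woman", "man"]),
            rng.choice(["16-29", "30-49", "50+"]),
            rng.choice(["FR", "NL"]))

respondents = random.sample(invited, 1_500)  # assume roughly 15% accept

# Stage 2: among positive respondents, fill quotas so the panel mirrors the
# population on a few criteria (gender, age band, official language).
def stratified_draw(respondents, seats=45):
    cells = list(product(["woman", "man"], ["16-29", "30-49", "50+"], ["FR", "NL"]))
    per_cell = seats // len(cells)  # equal quotas here; real quotas would follow census shares
    panel = []
    for cell in cells:
        matching = [r for r in respondents if profile(r) == cell]
        panel += random.sample(matching, min(per_cell, len(matching)))
    # Top up at random if rounding or small cells left seats unfilled
    remaining = [r for r in respondents if r not in panel]
    panel += random.sample(remaining, seats - len(panel))
    return panel

print(len(stratified_draw(respondents)))  # 45 citizens seated alongside 15 MPs
```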

The participating parliamentarians will be the members of the standing parliamentary committee that covers the topic under deliberation. In the regional parliament, each standing committee is made up of 15 members (including both Dutch- and French-speakers), and in the Cocof Parliament, each standing committee is made up of 12 members (only French-speakers)….(More)”.

Human-centred policy? Blending ‘big data’ and ‘thick data’ in national policy


Policy Lab (UK): “….Compared with quantitative data, ethnography creates different forms of data – what anthropologists call ‘thick data’. Complex social problems benefit from insights beyond linear, standardised evidence and this is where thick data shows its worth. In Policy Lab we have generated ethnographic films and analysis to sit alongside quantitative data, helping policy-makers to build a rich picture of current circumstances. 

On the other hand, much has been written about big data – data generated through digital interactions – whether it be traditional ledgers and spreadsheets or the emerging use of artificial intelligence and the internet of things. The ever-growing zettabytes of data can reveal a lot, providing a (sometimes real-time) digital trail capturing and aggregating our individual choices, preferences, behaviours and actions.

Much hyped, this quantitative data has great potential to inform future policy, but must be handled ethically, and also requires careful preparation and analysis to avoid biases and false assumptions creeping in. Three issues we have seen in our projects relate to:

  • partial data, for example not having data on people who are not digitally active, biasing the sample (see the sketch after this list)
  • the time-consuming challenge of cleaning up data, in a political context where time is often of the essence
  • the lack of data interoperability, where different localities/organisations capture different metrics
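
As a small illustration of the first issue, the sketch below shows how a metric collected mainly from digitally active people skews a naive estimate, and how post-stratification reweighting against known population shares offers one standard correction. All figures are invented.

```python
# Illustrative post-stratification: correct a digitally-collected sample for the
# people it mostly fails to reach. All numbers are invented for demonstration.

# Survey reached mainly digitally active respondents
sample = {
    "digitally_active": {"n": 900, "support_rate": 0.62},
    "not_digitally_active": {"n": 100, "support_rate": 0.35},  # small offline booster sample
}

# Known population shares (e.g. from census or connectivity statistics)
population_share = {"digitally_active": 0.80, "not_digitally_active": 0.20}

naive = sum(g["n"] * g["support_rate"] for g in sample.values()) / sum(g["n"] for g in sample.values())
reweighted = sum(population_share[k] * sample[k]["support_rate"] for k in sample)

print(f"naive estimate:      {naive:.3f}")       # 0.593, skewed toward the online group
print(f"reweighted estimate: {reweighted:.3f}")  # 0.566, closer to the full population
```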

Through a number of Policy Lab projects we have used big data to see the big picture before then using thick data to zoom in to the detail of people’s lived experience. Whereas big data can give us cumulative evidence at a macro, often systemic level, thick data provides insights at an individual or group level. We have found the blending of ‘big data’ and ‘thick data’ to be the sweet spot.

[Diagram: Policy Lab’s model for combining big data and thick data (2020)]

Policy Lab’s work develops data and insights into ideas for potential policy intervention which we can start to test as prototypes with real people. These operate at the ‘meso’ level (in the middle of the diagram above), informed by both the thick data from individual experiences and the big data at a population or national level. We have written a lot about prototyping for policy and are continuing to explore how you prototype a policy compared to, say, a digital service….(More)”.

Lack of guidance leaves public services in limbo on AI, says watchdog


Dan Sabbagh at the Guardian: “Police forces, hospitals and councils struggle to understand how to use artificial intelligence because of a lack of clear ethical guidance from the government, according to the country’s only surveillance regulator.

The surveillance camera commissioner, Tony Porter, said he received requests for guidance all the time from public bodies which do not know where the limits lie when it comes to the use of facial, biometric and lip-reading technology.

“Facial recognition technology is now being sold as standard in CCTV systems, for example, so hospitals are having to work out if they should use it,” Porter said. “Police are increasingly wearing body cameras. What are the appropriate limits for their use?

“The problem is that there is insufficient guidance for public bodies to know what is appropriate and what is not, and the public have no idea what is going on because there is no real transparency.”

The watchdog’s comments came as it emerged that Downing Street had commissioned a review led by the Committee on Standards in Public Life, whose chairman had called on public bodies to reveal when they use algorithms in decision making.

Lord Evans, a former MI5 chief, told the Sunday Telegraph that “it was very difficult to find out where AI is being used in the public sector” and that “at the very minimum, it should be visible, and declared, where it has the potential for impacting on civil liberties and human rights and freedoms”.

AI is increasingly deployed across the public sector in surveillance and elsewhere. The high court ruled in September that the police use of automatic facial recognition technology to scan people in crowds was lawful.

Its use by South Wales police was challenged by Ed Bridges, a former Lib Dem councillor, who noticed the cameras when he went out to buy a lunchtime sandwich, but the court held that the intrusion into privacy was proportionate….(More)”.