The EU Wants to Build One of the World’s Largest Biometric Databases. What Could Possibly Go Wrong?


Grace Dobush at Fortune: “China and India have built the world’s largest biometric databases, but the European Union is about to join the club.

The Common Identity Repository (CIR) will consolidate biometric data on almost all visitors and migrants to the bloc, as well as some EU citizens—connecting existing criminal, asylum, and migration databases and integrating new ones. It has the potential to affect hundreds of millions of people.

The plan for the database, first proposed in 2016 and approved by the EU Parliament on April 16, was sold as a way to better track and monitor terrorists, criminals, and unauthorized immigrants.

Initially the system will target the fingerprints and identity data of visitors and immigrants, and it represents the first step towards building a truly EU-wide citizen database. At the same time, though, critics argue its mere existence will increase the potential for hacks, leaks, and law enforcement abuse of the information….

The European Parliament and the European Council have promised to address those concerns through “proper safeguards” to protect personal privacy and to regulate officers’ access to data. In 2016, they passed a law regarding law enforcement’s access to personal data, alongside the General Data Protection Regulation, or GDPR.

But total security is a tall order. Germany is currently dealing with multiple instances of police officers allegedly leaking personal information to far-right groups. Meanwhile, a Swedish hacker went to prison for hacking into Denmark’s public records system in 2012 and dumping online the personal data of hundreds of thousands of citizens and migrants….(More)”.


Facebook will open its data up to academics to see how it impacts elections


MIT Technology Review: “More than 60 researchers from 30 institutions will get access to Facebook user data to study its impact on elections and democracy, and how it’s used by advertisers and publishers.

A vast trove: Facebook will let academics see which websites its users linked to from January 2017 to February 2019. Notably, that means they won’t be able to look at the platform’s impact on the US presidential election in 2016, or on the Brexit referendum in the UK in the same year.

Despite this slightly glaring omission, it’s still hard to wrap your head around the scale of the data that will be shared, given that Facebook is used by 1.6 billion people every day. That’s more people than live in all of China, the most populous country on Earth. It will be one of the largest data sets on human behavior online to ever be released.

The process: Facebook didn’t pick the researchers. They were chosen by the Social Science Research Council, a US nonprofit. Facebook has been working on this project for over a year, as it tries to balance research interests against user privacy and confidentiality.

Privacy: In a blog post, Facebook said it will use a number of statistical techniques to make sure the data set can’t be used to identify individuals. Researchers will be able to access it only via a secure portal that uses a VPN and two-factor authentication, and there will be limits on the number of queries they can each run….(More)”.
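
The post does not name the specific statistical techniques Facebook will use, but protections of this kind typically combine per-researcher query limits with noise added to aggregate results, in the spirit of differential privacy. A minimal illustrative sketch is below; every class, field and parameter name is a hypothetical stand-in, not Facebook's actual system or API.

```python
import random

class ProtectedPortal:
    """Toy model of a privacy-protected query portal (hypothetical, not Facebook's system)."""

    def __init__(self, records, query_budget=100, noise_scale=5.0):
        self._records = records          # raw rows never leave the portal
        self._budget = query_budget      # cap on the number of queries per researcher
        self._noise_scale = noise_scale  # scale of the Laplace noise added to counts

    def count_shares_of(self, domain):
        """Return a noisy count of shares linking to `domain`."""
        if self._budget <= 0:
            raise PermissionError("query budget exhausted")
        self._budget -= 1
        true_count = sum(1 for r in self._records if r["domain"] == domain)
        # The difference of two exponential draws gives Laplace noise,
        # masking any single user's contribution to the count.
        noise = (random.expovariate(1 / self._noise_scale)
                 - random.expovariate(1 / self._noise_scale))
        return max(0, round(true_count + noise))

# Example: researchers only ever see noisy aggregates, never individual rows.
portal = ProtectedPortal(records=[{"domain": "example.org"}, {"domain": "example.org"}],
                         query_budget=10)
print(portal.count_shares_of("example.org"))
```

The real protections are of course far more elaborate, but the design intent is the same: bound what any single query, or any sequence of queries, can reveal about one person.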

Nagging misconceptions about nudge theory


Cass Sunstein at The Hill: “Nudges are private or public initiatives that steer people in particular directions but that also allow them to go their own way.

A reminder is a nudge; so is a warning. A GPS device nudges; a default rule, automatically enrolling people in some program, is a nudge.

To qualify as a nudge, an initiative must not impose significant economic incentives. A subsidy is not a nudge; a tax is not a nudge; a fine or a jail sentence is not a nudge. To count as such, a nudge must fully preserve freedom of choice.

In 2008, University of Chicago economist Richard Thaler and I co-wrote a book that drew on research in psychology and behavioral economics to help people and institutions, both public and private, improve their decision-making.

In the 10 years since “Nudge” was published, there has been an extraordinary outpouring of new thought and action, with particular reference to public policy.

Behavioral insight teams, or “nudge units” of various sorts, can be found in many nations, including Australia, Canada, Denmark, the United Kingdom, the United States, the Netherlands, Germany, Singapore, Japan and Qatar.

Those teams are delivering. By making government more efficient, and by improving safety and health, they are helping to save a lot of money and a lot of lives. And in many countries, including the U.S., they don’t raise partisan hackles; both Democrats and Republicans have enthusiastically embraced them.   

Still, there are a lot of mistakes and misconceptions out there, and they are diverting attention and hence stalling progress. Here are the three big ones:

1. Nudges do not respect freedom. …

2. Nudges are based on excessive trust in government...

3. Nudges cannot achieve a whole lot.…(More)”.

Reconnecting citizens with EU decision-making is possible – and needs to happen now


Opinion piece by Anthony Zacharzewski: “Maybe it’s the Brexit effect, or perhaps the memories of the great recession are fading, but in poll after poll, Europe’s citizens are saying that they feel more European and strongly supportive of EU membership. …

While sighs of relief can be heard from Schuman to Strasbourg, the new Parliament and Commission will inherit a fragile and fractious Europe this year, after a decade in which the EU has bounced from crisis to crisis. One of their most important tasks will immediately be to connect EU citizens more closely to the institutions and their decision making….

The new European Commission and Parliament have the chance to change that, by adopting an ambitious open government agenda that puts citizen participation in decision making at its heart.

There are three things on our wish list for doing this.

The first thing on our list is an EU-wide commitment to policy making “in the open.” Built on a renewed commitment to transparency, it would set a unified approach to consultation, as well as identifying major policy areas where citizen involvement is valuable and where citizens are likely to want to be involved. This could include issues such as migration and climate change. Member states, particularly those in the Open Government Partnership, have already developed a lot of good practice that can help inform this, while the Open Government Network for Europe, which brings together civil society and government voices, is ready to help.

Secondly, the connection to civil society and citizens also needs to be made beyond the European level, supporting and making use of the rapidly growing networks of democratic innovation at local level. Citizen participation is increasingly shifting from one-off events to a standing part of the governing system, so the European institutions need to listen to local conversations and support them with better information. Public Square, our own project run in partnership with mySociety and funded by Luminate, is a good example. It is working with local government and citizens to understand how meaningful citizen participation can become an everyday part of the way all local decision-making happens.

The last item on our wish list would be greater coherence between the institutions in Brussels and Strasbourg to better involve citizens. While the European Parliament, Commission and Council all have their different roles and prerogatives, without a co-ordinated approach, the attention and resources they have will be dissipated across multiple conversations. Most importantly, it will be harder to demonstrate to citizens that their contributions have made a difference….(More)”.

How Technology Could Revolutionize Refugee Resettlement


Krishnadev Calamur in The Atlantic: “… For nearly 70 years, the process of interviewing, allocating, and accepting refugees has gone largely unchanged. In 1951, 145 countries came together in Geneva, Switzerland, to sign the Refugee Convention, the pact that defines who is a refugee, what refugees’ rights are, and what legal obligations states have to protect them.

This process was born of the idealism of the postwar years—an attempt to make certain that those fleeing war or persecution could find safety so that horrific moments in history, such as the Holocaust, didn’t recur. The pact may have been far from perfect, but in successive years, it was a lifeline to Afghans, Bosnians, Kurds, and others displaced by conflict.

The world is a much different place now, though. The rise of populism has brought with it a concomitant hostility toward immigrants in general and refugees in particular. Last October, a gunman who had previously posted anti-Semitic messages online against HIAS, the Jewish nonprofit that resettles refugees, killed 11 worshippers in a Pittsburgh synagogue. Many of the policy arguments over resettlement have shifted focus from humanitarian relief to security threats and cost. The Trump administration has drastically cut the number of refugees the United States accepts, and large parts of Europe are following suit.

If it works, a software program called Annie could change that dynamic. Developed at Worcester Polytechnic Institute in Massachusetts, Lund University in Sweden, and the University of Oxford in Britain, the software uses what’s known as a matching algorithm to allocate refugees with no ties to the United States to their new homes. (Refugees with ties to the United States are resettled in places where they have family or community support; software isn’t involved in the process.)

Annie’s algorithm is based on a machine learning model in which a computer is fed huge piles of data from past placements, so that the program can refine its future recommendations. The system examines a series of variables—physical ailments, age, levels of education and languages spoken, for example—related to each refugee case. In other words, the software uses previous outcomes and current constraints to recommend where a refugee is most likely to succeed. Every city where HIAS has an office or an affiliate is given a score for each refugee. The higher the score, the better the match.

This is a drastic departure from how refugees are typically resettled. Each week, HIAS and the eight other agencies that allocate refugees in the United States make their decisions based largely on local capacity, with limited emphasis on individual characteristics or needs….(More)”.
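
The Atlantic piece describes Annie only at a high level, but the scoring-and-matching idea it outlines can be sketched roughly as follows. Everything in this sketch is an illustrative assumption (the feature names, the linear scoring function standing in for a trained model, and the greedy capacity-aware assignment); the production software presumably relies on properly trained outcome models and more careful optimization.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class RefugeeCase:
    case_id: str
    features: Dict[str, float]  # e.g. age, years of education, number of languages spoken

def predicted_success(case: RefugeeCase, city: str,
                      weights: Dict[str, Dict[str, float]]) -> float:
    """Score a (case, city) pair with a simple linear model.
    In a real system the weights would come from a model trained on past placement outcomes."""
    w = weights[city]
    return sum(w.get(name, 0.0) * value for name, value in case.features.items())

def assign(cases: List[RefugeeCase],
           capacity: Dict[str, int],
           weights: Dict[str, Dict[str, float]]) -> Dict[str, str]:
    """Greedy assignment: each case goes to its highest-scoring city that still has capacity."""
    placements: Dict[str, str] = {}
    for case in cases:
        ranked = sorted(capacity, key=lambda c: predicted_success(case, c, weights), reverse=True)
        for city in ranked:
            if capacity[city] > 0:
                capacity[city] -= 1
                placements[case.case_id] = city
                break
    return placements

# Hypothetical example: two cases, two affiliate cities with one slot each.
weights = {"Cityville": {"languages": 0.8, "education": 0.3},
           "Townsburg": {"languages": 0.2, "education": 0.9}}
cases = [RefugeeCase("A-001", {"languages": 3, "education": 8}),
         RefugeeCase("A-002", {"languages": 1, "education": 16})]
print(assign(cases, {"Cityville": 1, "Townsburg": 1}, weights))
```

Per-city scores of this kind are what would let such a system weigh individual characteristics against local capacity, rather than deciding on capacity alone, as the excerpt describes.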

How Ireland’s Citizens’ Assembly helped climate action


Blog post by Frances Foley: “..In July 2016, the new government – led by Fine Gael, backed by independents – put forward a bill to establish a national-level Citizens’ Assembly to look at the biggest issues of the day. These included the challenges of an ageing population; the role of fixed-term parliaments; referendums; the 8th Amendment on abortion; and climate change.

Citizens from every region, every socio-economic background, each ethnicity and age group and from right across the spectrum of political opinion convened over the course of two weekends between September and November 2017. The issue seemed daunting in scale and complexity, but the participants had been well briefed and had at their disposal a line-up of experts, scientists, advocates and other witnesses who would help them make sense of the material. By the end, citizens had produced a radical series of recommendations which went far beyond what any major Irish party was promising, surprising even the initiators of the process….

As expected, the passage of some of the proposals through the Irish party gauntlet has not been smooth. The eight-hour debate on increasing the carbon tax, for example, suggests that mixing deliberative and representative democracy still produces conflict and confusion. It is certainly clear that parliaments have to adapt and develop if citizens’ assemblies are ever to find their place in our modern democracies.

But the most encouraging move has been the simple acknowledgement that many of the barriers to implementation lie at the level of governance. The new Climate Action Commission, with a mandate to monitor climate action across government, should act as the governmental guarantor of the vision from the Citizens’ Assembly. Citizens’ proposals have themselves stimulated a review of internal government processes to stop their demands getting mired in party wrangling and government bureaucracy. By their very nature, successful citizens’ assemblies can also provide an alternative vision of how decisions can be made – and in so doing shame political parties and parliaments into improving their decision-making practices.

Does the Irish Citizens’ Assembly constitute a case of rapid transition? In terms of its breadth, scale and vision, the experiment is impressive. But in terms of speed, deliberative processes are often criticised for being slow, unwieldy and costly. The response to this should be to ask what we’re getting: whilst an Assembly is not the most rapid vehicle for change – most serious processes take several months, if not a couple of years – the results, both in specific outcomes and in cultural or political shifts, can be astounding….

In respect to climate change, this harmony between ends and means is particularly significant. The climate crisis is the most severe collective decision-making challenge of our times, one that demands courage, but also careful thought….(More)”.

Many Across the Globe Are Dissatisfied With How Democracy Is Working


Pew Research Center: “Anger at political elites, economic dissatisfaction and anxiety about rapid social changes have fueled political upheaval in regions around the world in recent years. Anti-establishment leaders, parties and movements have emerged on both the right and left of the political spectrum, in some cases challenging fundamental norms and institutions of liberal democracy. Organizations from Freedom House to the Economist Intelligence Unit to V-Dem have documented global declines in the health of democracy.

As previous Pew Research Center surveys have illustrated, ideas at the core of liberal democracy remain popular among global publics, but commitment to democracy can nonetheless be weak. Multiple factors contribute to this lack of commitment, including perceptions about how well democracy is functioning. And as findings from a new Pew Research Center survey show, views about the performance of democratic systems are decidedly negative in many nations. Across 27 countries polled, a median of 51% are dissatisfied with how democracy is working in their country; just 45% are satisfied.

Assessments of how well democracy is working vary considerably across nations. In Europe, for example, more than six-in-ten Swedes and Dutch are satisfied with the current state of democracy, while large majorities in Italy, Spain and Greece are dissatisfied.

To better understand the discontent many feel with democracy, we asked people in the 27 nations studied about a variety of economic, political, social and security issues. The results highlight some key areas of public frustration: Most believe elections bring little change, that politicians are corrupt and out of touch and that courts do not treat people fairly. On the other hand, people are more positive about how well their countries protect free expression, provide economic opportunity and ensure public safety.

We also asked respondents about other topics, such as the state of the economy, immigration and attitudes toward major political parties. And in Europe, we included additional questions about immigrants and refugees, as well as opinions about the European Union….(More)”.

Politics and Technology in the Post-Truth Era


Book edited by Anna Visvizi and Miltiadis D. Lytras: “Advances in information and communication technology (ICT) have directly impacted the way in which politics operates today. Bringing together research on Europe, the US, South America, the Middle East, Asia and Africa, this book examines the relationship between ICT and politics in a global perspective.

Technological innovations such as big data, data mining, sentiment analysis, cognitive computing, artificial intelligence, virtual reality, augmented reality, social media and blockchain technology are reshaping the way ICT intersects with politics and in this collection contributors examine these developments, demonstrating their impact on the political landscape. Chapters examine topics such as cyberwarfare and propaganda, post-Soviet space, Snowden, US national security, e-government, GDPR, democratization in Africa and internet freedom.


Providing an overview of new research on the emerging relationship between the promise and potential inherent in ICT and its impact on politics, this edited collection will prove an invaluable text for students, researchers and practitioners working in the fields of Politics, International Relations and Computer Science….(More)”

Whose Commons? Data Protection as a Legal Limit of Open Science


Mark Phillips and Bartha M. Knoppers in the Journal of Law, Medicine and Ethics: “Open science has recently gained traction as establishment institutions have come on-side and thrown their weight behind the movement and initiatives aimed at creation of information commons. At the same time, the movement’s traditional insistence on unrestricted dissemination and reuse of all information of scientific value has been challenged by the movement to strengthen protection of personal data. This article assesses tensions between open science and data protection, with a focus on the GDPR.

Powerful institutions across the globe have recently joined the ranks of those making substantive commitments to “open science.” For example, the European Commission and the NIH National Cancer Institute are supporting large-scale collaborations, such as the Cancer Genome Collaboratory, the European Open Science Cloud, and the Genomic Data Commons, with the aim of making giant stores of genomic and other data readily available for analysis by researchers. In the field of neuroscience, the Montreal Neurological Institute is midway through a novel five-year project through which it plans to adopt open science across the full spectrum of its research. The commitment is “to make publicly available all positive and negative data by the date of first publication, to open its biobank to registered researchers and, perhaps most significantly, to withdraw its support of patenting on any direct research outputs.” The resources and influence of these institutions seem to be tipping the scales, transforming open science from a longstanding aspirational ideal into an existing reality.

Although open science lacks any standard, accepted definition, one widely-cited model proposed by the Austria-based advocacy effort openscienceASAP describes it by reference to six principles: open methodology, open source, open data, open access, open peer review, and open educational resources. The overarching principle is “the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process.” This article adopts this principle as a working definition of open science, with a particular emphasis on open sharing of human data.

As noted above, many of the institutions committed to open science use the word “commons” to describe their initiatives, and the two concepts are closely related. “Medical information commons” refers to “a networked environment in which diverse sources of health, medical, and genomic information on large populations become widely shared resources.” Commentators explicitly link the success of information commons and progress in the research and clinical realms to open science-based design principles such as data access and transparent analysis (i.e., sharing of information about methods and other metadata together with medical or health data).

But what legal, as well as ethical and social, factors will ultimately shape the contours of open science? Should all restrictions be fought, or should some be allowed to persist, and if so, in what form? Given that a commons is not a free-for-all, in that its governing rules shape its outcomes, how might we tailor law and policy to channel open science to fulfill its highest aspirations, such as universalizing practical access to scientific knowledge and its benefits, and avoid potential pitfalls? This article primarily concerns research data, although passing reference is also made to the approach to the terms under which academic publications are available, which are subject to similar debates….(More)”.

The Importance of Data Access Regimes for Artificial Intelligence and Machine Learning


JRC Digital Economy Working Paper by Bertin Martens: “Digitization triggered a steep drop in the cost of information. The resulting data glut created a bottleneck because human cognitive capacity is unable to cope with large amounts of information. Artificial intelligence and machine learning (AI/ML) triggered a similar drop in the cost of machine-based decision-making and helps in overcoming this bottleneck. Substantial change in the relative price of resources puts pressure on ownership and access rights to these resources. This explains pressure on access rights to data. ML thrives on access to big and varied datasets. We discuss the implications of access regimes for the development of AI in its current form of ML. The economic characteristics of data (non-rivalry, economies of scale and scope) favour data aggregation in big datasets. Non-rivalry implies the need for exclusive rights in order to incentivise data production when it is costly. The balance between access and exclusion is at the centre of the debate on data regimes. We explore the economic implications of several modalities for access to data, ranging from exclusive monopolistic control to monopolistic competition and free access. Regulatory intervention may push the market beyond voluntary exchanges, either towards more openness or reduced access. This may generate private costs for firms and individuals. Society can choose to do so if the social benefits of this intervention outweigh the private costs.

We briefly discuss the main EU legal instruments that are relevant for data access and ownership, including the General Data Protection Regulation (GDPR) that defines the rights of data subjects with respect to their personal data and the Database Directive (DBD) that grants ownership rights to database producers. These two instruments leave a wide legal no-man’s land where data access is ruled by bilateral contracts and Technical Protection Measures that give exclusive control to de facto data holders, and by market forces that drive access, trade and pricing of data. The absence of exclusive rights might facilitate data sharing and access or it may result in a segmented data landscape where data aggregation for ML purposes is hard to achieve. It is unclear if incompletely specified ownership and access rights maximize the welfare of society and facilitate the development of AI/ML…(More)”