Missing Numbers


Introduction by Anna Powell-Smith of a new “blog on the data the government should collect, but doesn’t”: “…Over time, I started to notice a pattern. Across lots of different policy areas, it was impossible for governments to make good decisions because of a basic lack of data. There was always critical data that the state either didn’t collect at all, or collected so badly that it made change impossible.

Eventually, I decided that the power to not collect data is one of the most important and little-understood sources of power that governments have. This is why I’m writing Missing Numbers: to encourage others to ask “is this lack of data a deliberate ploy to get away with something?”

By refusing to amass knowledge in the first place, decision-makers exert power over the rest of us. It’s time that this power was revealed, so we can have better conversations about what we need to know to run this country successfully.

A typical example

The government records and publishes data on how often each NHS hospital receives formal complaints. This is very helpful, because it means patients and the people who care for them can spot hospitals whose performance is worrying.

But the government simply doesn’t record data, even internally, on how often formal complaints are made about each Jobcentre. (That FOI response is from 2015, but I’ve confirmed it’s still true in 2019.) So it is impossible for it to know if some Jobcentres are being seriously mismanaged….(More)”.

The EU Wants to Build One of the World’s Largest Biometric Databases. What Could Possibly Go Wrong?


Grace Dobush at Fortune: “China and India have built the world’s largest biometric databases, but the European Union is about to join the club.

The Common Identity Repository (CIR) will consolidate biometric data on almost all visitors and migrants to the bloc, as well as some EU citizens—connecting existing criminal, asylum, and migration databases and integrating new ones. It has the potential to affect hundreds of millions of people.

The plan for the database, first proposed in 2016 and approved by the EU Parliament on April 16, was sold as a way to better track and monitor terrorists, criminals, and unauthorized immigrants.

The system will target the fingerprints and identity data for visitors and immigrants initially, and represents the first step towards building a truly EU-wide citizen database. At the same time, though, critics argue its mere existence will increase the potential for hacks, leaks, and law enforcement abuse of the information….

The European Parliament and the European Council have promised to address those concerns, through “proper safeguards” to protect personal privacy and to regulate officers’ access to data. In 2016, they passed a law regarding law enforcement’s access to personal data, alongside the General Data Protection Regulation (GDPR).

But total security is a tall order. Germany is currently dealing with multiple instances of police officers allegedly leaking personal information to far-right groups. Meanwhile, a Swedish hacker went to prison for hacking into Denmark’s public records system in 2012 and dumping online the personal data of hundreds of thousands of citizens and migrants….(More)”.


Facebook will open its data up to academics to see how it impacts elections


MIT Technology Review: “More than 60 researchers from 30 institutions will get access to Facebook user data to study its impact on elections and democracy, and how it’s used by advertisers and publishers.

A vast trove: Facebook will let academics see which websites its users linked to from January 2017 to February 2019. Notably, that means they won’t be able to look at the platform’s impact on the US presidential election in 2016, or on the Brexit referendum in the UK in the same year.

Despite this slightly glaring omission, it’s still hard to wrap your head around the scale of the data that will be shared, given that Facebook is used by 1.6 billion people every day. That’s more people than live in all of China, the most populous country on Earth. It will be one of the largest data sets on human behavior online to ever be released.

The process: Facebook didn’t pick the researchers. They were chosen by the Social Science Research Council, a US nonprofit. Facebook has been working on this project for over a year, as it tries to balance research interests against user privacy and confidentiality.

Privacy: In a blog post, Facebook said it will use a number of statistical techniques to make sure the data set can’t be used to identify individuals. Researchers will be able to access it only via a secure portal that uses a VPN and two-factor authentication, and there will be limits on the number of queries they can each run….(More)”.

Nagging misconceptions about nudge theory


Cass Sunstein at The Hill: “Nudges are private or public initiatives that steer people in particular directions but that also allow them to go their own way.

A reminder is a nudge; so is a warning. A GPS device nudges; a default rule, automatically enrolling people in some program, is a nudge.

To qualify as a nudge, an initiative must not impose significant economic incentives. A subsidy is not a nudge; a tax is not a nudge; a fine or a jail sentence is not a nudge. To count as such, a nudge must fully preserve freedom of choice.

In 2009, University of Chicago economist Richard Thaler and I co-wrote a book that drew on research in psychology and behavioral economics to help people and institutions, both public and private, improve their decision-making.

In the 10 years since “Nudge” was published, there has been an extraordinary outpouring of new thought and action, with particular reference to public policy.

Behavioral insight teams, or “nudge units” of various sorts, can be found in many nations, including Australia, Canada, Denmark, the United Kingdom, the United States, the Netherlands, Germany, Singapore, Japan and Qatar.

Those teams are delivering. By making government more efficient, and by improving safety and health, they are helping to save a lot of money and a lot of lives. And in many countries, including the U.S., they don’t raise partisan hackles; both Democrats and Republicans have enthusiastically embraced them.   

Still, there are a lot of mistakes and misconceptions out there, and they are diverting attention and hence stalling progress. Here are the three big ones:

1. Nudges do not respect freedom. …

2. Nudges are based on excessive trust in government...

3. Nudges cannot achieve a whole lot.…(More)”.

Revisiting the causal effect of democracy on long-run development


Blog post by Markus Eberhardt: “In a recent paper, Acemoglu et al. (2019), henceforth “ANRR”, demonstrated a significant and large causal effect of democracy on long-run growth. By adopting a simple binary indicator for democracy, and accounting for the dynamics of development, these authors found that a shift to democracy leads to a 20% higher level of development in the long run.1

The findings are remarkable in three ways: 

  1. Previous research often emphasised that a simple binary measure for democracy was perhaps “too blunt a concept” (Persson and Tabellini 2006) to provide robust empirical evidence.
  2. Positive effects of democracy on growth were typically only a “short-run boost” (Rodrik and Wacziarg 2005).
  3. The empirical findings are robust across a host of empirical estimators with different assumptions about the data generating process, including one adopting a novel instrumentation strategy (regional waves of democratisation).

ANRR’s findings are important because, as they highlight in a column on Vox, “a belief that democracy is bad for economic growth is common in both academic political economy as well as the popular press.” For example, Posner (2010) wrote that “[d]ictatorship will often be optimal for very poor countries”. 

The simplicity of ANRR’s empirical setup, the large sample of countries, the long time horizon (1960 to 2010), and the robust positive – and remarkably stable – results across the many empirical methods they employ send a very powerful message against such doubts: democracy does cause growth.

I agree with their conclusion, but with qualifications. …(More)”.

Finding Ctrl: Visions for the Future Internet


Nesta: “In March 2019, the World Wide Web turned thirty, and October will mark the fiftieth anniversary of the internet itself. These anniversaries offer us an important opportunity to reflect on the internet’s history, but also a chance to ponder its future.

While early internet pioneers dreamed of an internet that would be open, free and decentralised, the story of the internet today is mostly a story of loss of control. Just a handful of companies determine what we read, see and buy, where we work and where we live, who we vote for, who we love, and who we are. Many of us feel increasingly uneasy about these developments. We live in a world where new technologies happen to us; the average person has very little agency to change things within the current political and economic parameters.

Yet things don’t have to be this way. In a time where the future of the internet is usually painted as bleak and uncertain, we need positive visions about where we go next.

As part of the Next Generation Internet (NGI) initiative – the European Commission’s new flagship programme working on building a more democratic, inclusive and resilient internet – we have created this “visions book”, a collection of essays, short stories, poetry and artworks from over 30 contributors from 15 countries and five continents. Each contributor has a unique background, as most were selected via an open call for submissions held last autumn. As such, the book collects both established and emerging voices, all reflecting on the same crucial questions: where did we come from, but more importantly, where do we go next?

The NGI hopes to empower everyone to take active control in shaping the future: the internet does not just belong to those who hold power today, but to all of us….(More)”.

The Blockchain Game: A great new tool for your classroom


IBM Blockchain Blog: “Blockchain technology can be a game-changer for accounting, supply chain, banking, contract law, and many other fields. But it will only be useful if lots and lots of non-technical managers and leaders trust and adopt it. And right now, just understanding what blockchain is can be difficult, even for the brightest in these fields. Enter The Blockchain Game, a hands-on exercise that explains blockchain’s core principles and serves as a launching pad for discussion of blockchain’s real-world applications.

In The Blockchain Game students act as nodes and miners on a blockchain network for storing student grades at a university. Participants record the grade and course information, and then “build the block” by calculating a unique identifier (a hash) to secure the grade ledger, and miners get rewarded for their work. As the game is played, the audience learns about hashes, private keys, and what uses are appropriate for a blockchain ledger.
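
To make the hashing step concrete, here is a minimal Python sketch of the kind of calculation the game has participants do by hand; the record fields and the toy two-block ledger are illustrative assumptions, not the game’s actual worksheet.

```python
import hashlib
import json

def build_block(previous_hash: str, grade_record: dict) -> dict:
    """Assemble a block and compute its SHA-256 identifier (its 'hash')."""
    block = {
        "previous_hash": previous_hash,  # links this block to the one before it
        "data": grade_record,            # e.g. student, course, grade
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

# A toy two-block grade ledger: altering the grade in the first block would
# change its hash and break the link stored in the second block.
genesis = build_block("0" * 64, {"student": "A. Student", "course": "CS101", "grade": "B+"})
second = build_block(genesis["hash"], {"student": "B. Student", "course": "CS101", "grade": "A-"})
print(second["previous_hash"] == genesis["hash"])  # True while the ledger is intact
```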

Basics of the Game

  • A hands-on simulation centering on a blockchain for academic scores, including a discussion at the end of the simulation about whether storing grades would be a good application for blockchain.
  • No computers. Participants are the computers and calculate the blocks.
  • The game seeks to teach core concepts about a distributed ledger but can be modified to whichever use case the educator wishes to explore — smart contracts, supply chain, applications and others.
  • Additional elements can be added if instructors want to facilitate the game on a computer….(More)”.

How Ireland’s Citizens’ Assembly helped climate action


Blog post by Frances Foley: “…In July 2016, the new government – led by Fine Gael, backed by independents – put forward a bill to establish a national-level Citizens’ Assembly to look at the biggest issues of the day. These included the challenges of an ageing population; the role of fixed-term parliaments; referendums; the 8th Amendment on abortion; and climate change.

Citizens from every region, every socio-economic background, each ethnicity and age group and from right across the spectrum of political opinion convened over the course of two weekends between September and November 2017. The issue seemed daunting in scale and complexity, but the participants had been well-briefed and had at their disposal a line-up of experts, scientists, advocates and other witnesses who would help them make sense of the material. By the end, citizens had produced a radical series of recommendations which went far beyond what any major Irish party was promising, surprising even the initiators of the process….

As expected, the passage of some of the proposals through the Irish party gauntlet has not been smooth. The 8-hour-long debate on increasing the carbon tax, for example, suggests that mixing deliberative and representative democracy still produces conflict and confusion. It is certainly clear that parliaments have to adapt and develop if citizens’ assemblies are ever to find their place in our modern democracies.

But the most encouraging move has been the simple acknowledgement that many of the barriers to implementation lie at the level of governance. The new Climate Action Commission, with a mandate to monitor climate action across government, should act as the governmental guarantor of the vision from the Citizens’ Assembly. Citizens’ proposals have themselves stimulated a review of internal government processes to stop their demands getting mired in party wrangling and government bureaucracy. By their very nature, the success of citizens’ assemblies can also provide an alternative vision of how decisions can be made – and in so doing shame political parties and parliaments into improving their decision-making practices.

Does the Irish Citizens’ Assembly constitute a case of rapid transition? In terms of its breadth, scale and vision, the experiment is impressive. But in terms of speed, deliberative processes are often criticised for being slow, unwieldy and costly. The response to this should be to ask what we’re getting: whilst an Assembly is not the most rapid vehicle for change – most serious processes take several months, if not a couple of years – the results, both in specific outcomes and in cultural or political shifts, can be astounding….

In respect to climate change, this harmony between ends and means is particularly significant. The climate crisis is the most severe collective decision-making challenge of our times, one that demands courage, but also careful thought….(More)”.

AI & Global Governance: Robots Will Not Only Wage Future Wars but also Future Peace


Daanish Masood & Martin Waehlisch at the United Nations University: “At the United Nations, we have been exploring completely different scenarios for AI: its potential to be used for the noble purposes of peace and security. This could revolutionize the way we prevent and solve conflicts globally.

Two of the most promising areas are Machine Learning and Natural Language Processing. Machine Learning involves computer algorithms detecting patterns from data to learn how to make predictions and recommendations. Natural Language Processing involves computers learning to understand human languages.

At the UN Secretariat, our chief concern is with how these emerging technologies can be deployed for the good of humanity to de-escalate violence and increase international stability.

This endeavor has admirable precedent. During the Cold War, computer scientists used multilayered simulations to predict the scale and potential outcome of the arms race between the East and the West.

Since then, governments and international agencies have increasingly used computational models and advanced Machine Learning to try to understand recurrent conflict patterns and forecast moments of state fragility.

But two things have transformed the scope for progress in this field.

The first is the sheer volume of data now available from what people say and do online. The second is the game-changing growth in computational capacity that allows us to crunch unprecedented, inconceivable quantities of data with relative speed and ease.

So how can this help the United Nations build peace? Three ways come to mind.

Firstly, overcoming cultural and language barriers. By teaching computers to understand human language and the nuances of dialects, not only can we better link up what people write on social media to local contexts of conflict, we can also more methodically follow what people say on radio and TV. As part of the UN’s early warning efforts, this can help us detect hate speech in a place where the potential for conflict is high. This is crucial because the UN often works in countries where internet coverage is low, and where the spoken languages may not be well understood by many of its international staff.

Natural Language Processing algorithms can help to track and improve understanding of local debates, which might well be blind spots for the international community. If we combine such methods with Machine Learning chatbots, the UN could conduct large-scale digital focus groups with thousands of people in real time, enabling different demographic segments in a country to voice their views on, say, a proposed peace deal – instantly testing public support, and indicating the chances of sustainability.
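
As a hedged illustration of the kind of text-classification building block such monitoring could rest on (not the UN’s actual tooling), the Python sketch below trains a tiny classifier to flag posts for human review; the example texts and labels are invented, and real hate-speech detection requires large multilingual corpora and careful human oversight.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: 0 = neutral, 1 = flag for human review.
train_texts = [
    "we welcome the peace talks in our district",
    "the market reopened after the ceasefire",
    "those people must be driven out of our town",
    "they deserve whatever violence comes to them",
]
train_labels = [0, 0, 1, 1]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

new_posts = ["the ceasefire is holding in the north"]
print(model.predict_proba(new_posts))  # probability the post needs review
```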

Secondly, anticipating the deeper drivers of conflict. We could combine new imaging techniques – whether satellites or drones – with automation. For instance, many parts of the world are experiencing severe groundwater withdrawal and water aquifer depletion. Water scarcity, in turn, drives conflicts and undermines stability in post-conflict environments, where violence around water access becomes more likely, along with large movements of people leaving newly arid areas.

One of the best predictors of water depletion is land subsidence or sinking, which can be measured by satellite and drone imagery. By combining these imaging techniques with Machine Learning, the UN can work in partnership with governments and local communities to anticipate future water conflicts and begin working proactively to reduce their likelihood.
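
The sketch below is a hedged illustration of how measured subsidence could feed such a predictive model; the features, numbers and model choice are assumptions made for illustration, not an actual UN pipeline, and operational work would use calibrated satellite or drone measurements plus ground-truth well data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Each row: [annual land subsidence (cm/yr), rainfall anomaly (mm), irrigated area share]
X = np.array([
    [0.5,  20, 0.10],
    [2.0, -40, 0.55],
    [3.5, -60, 0.70],
    [0.2,  10, 0.05],
])
# Target: observed water-table decline (m/yr) at monitoring wells (invented values).
y = np.array([0.10, 0.90, 1.60, 0.05])

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(model.predict([[2.8, -50, 0.60]]))  # projected decline for a new area
```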

Thirdly, advancing decision making. In the work of peace and security, it is surprising how many consequential decisions are still made solely on the basis of intuition.

Yet complex decisions often need to navigate conflicting goals and undiscovered options, against a landscape of limited information and political preference. This is where we can use Deep Learning – where a network can absorb huge amounts of public data and test it against real-world examples on which it is trained, while applying probabilistic modeling. This mathematical approach can help us generate models of our uncertain, dynamic world with limited data.
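
As a toy illustration of probabilistic modeling with limited data (not the UN’s actual models), the sketch below updates a Beta prior on the weekly probability that a ceasefire holds, using a handful of invented observations.

```python
from scipy.stats import beta

# Weakly informative prior over the probability that the ceasefire holds in a given week.
prior_alpha, prior_beta = 2, 2
# Limited, invented observations: weeks in which the ceasefire held or was broken.
weeks_held, weeks_broken = 7, 1

posterior = beta(prior_alpha + weeks_held, prior_beta + weeks_broken)
print(posterior.mean())         # updated point estimate
print(posterior.interval(0.9))  # 90% credible interval
```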

With better data, we can eventually make better predictions to guide complex decisions. Future senior peace envoys charged with mediating a conflict would benefit from such advances to stress test elements of a peace agreement. Of course, human decision-making will remain crucial, but would be informed by more evidence-driven robust analytical tools….(More)”.

Introducing the Contractual Wheel of Data Collaboration


Blog by Andrew Young and Stefaan Verhulst: “Earlier this year we launched the Contracts for Data Collaboration (C4DC) initiative — an open collaborative with charter members from The GovLab, UN SDSN Thematic Research Network on Data and Statistics (TReNDS), University of Washington and the World Economic Forum. C4DC seeks to address the inefficiencies of developing contractual agreements for public-private data collaboration by developing and making available a shared repository of relevant contractual clauses taken from existing legal agreements, to inform and guide those seeking to establish a data collaborative. Today TReNDS published “Partnerships Founded on Trust,” a brief capturing some initial findings from the C4DC initiative.

[Image: The Contractual Wheel of Data Collaboration [beta] — Stefaan G. Verhulst and Andrew Young, The GovLab]

As part of the C4DC effort, and to support Data Stewards in the private sector and decision-makers in the public and civil sectors seeking to establish Data Collaboratives, The GovLab developed the Contractual Wheel of Data Collaboration [beta]. The Wheel seeks to capture key elements involved in data collaboration while demystifying contracts and moving beyond the type of legalese that can create confusion and barriers to experimentation.

The Wheel was developed based on an assessment of existing legal agreements, engagement with The GovLab-facilitated Data Stewards Network, and analysis of the key elements of our Data Collaboratives Methodology. It features 22 legal considerations organized across 6 operational categories that can act as a checklist for the development of a legal agreement between parties participating in a Data Collaborative:…(More)”.