The Risks of Dangerous Dashboards in Basic Education


Lant Pritchett at the Center for Global Development: “On June 1, 2009 Air France flight 447 from Rio de Janeiro to Paris crashed into the Atlantic Ocean killing all 228 people on board. While the Airbus 330 was flying on auto-pilot, the different speed indicators received by the on-board navigation computers started to give conflicting speeds, almost certainly because the pitot tubes responsible for measuring air speed had iced over. Since the auto-pilot could not resolve conflicting signals and hence did not know how fast the plane was actually going, it turned control of the plane over to the two first officers (the captain was out of the cockpit). Subsequent flight simulator trials replicating the conditions of the flight conclude that had the pilots done nothing at all everyone would have lived—nothing was actually wrong; only the indicators were faulty, not the actual speed. But, tragically, the pilots didn’t do nothing….

What is the connection to education?

Many countries’ systems of basic education are in “stall” condition.

A recent paper by Beatty et al. (2018) uses information from the Indonesia Family Life Survey, a representative household survey that has been carried out in several waves with the same individuals since 2000 and contains information on whether individuals can answer simple arithmetic questions. Figure 1, showing the relationship between the level of schooling and the probability of answering a typical question correctly, reveals two shocking results.

First, the likelihood that a person can answer a simple mathematics question correctly differs by only 20 percentage points between individuals who have completed less than primary school (<PS), who answer correctly (adjusted for guessing) about 20 percent of the time, and those who have completed senior secondary school or more (>=SSS), who answer correctly only about 40 percent of the time. These are simple multiple-choice questions, such as whether 56/84 is the same fraction as (can be reduced to) 2/3, and whether 1/3-1/6 equals 1/6. This means that in an entire year of schooling, fewer than 2 additional children per 100 gain the ability to answer simple arithmetic questions.
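The two example items, and the implied learning gradient, can be checked with exact arithmetic. A quick sketch using Python's standard fractions module (note: the ~12 grade levels separating <PS from >=SSS is my assumption, not a figure from the paper):

```python
from fractions import Fraction

# The two survey items quoted above:
# Does 56/84 reduce to 2/3?
assert Fraction(56, 84) == Fraction(2, 3)
# Does 1/3 - 1/6 equal 1/6?
assert Fraction(1, 3) - Fraction(1, 6) == Fraction(1, 6)

# The learning gradient implied by Figure 1: roughly 20 percentage
# points spread over the ~12 grades between <PS and >=SSS (assumed),
# i.e. under 2 additional correct answerers per 100 children per year.
gain_per_year = (0.40 - 0.20) / 12 * 100
print(f"{gain_per_year:.2f} additional children per 100 per year")
# prints: 1.67 additional children per 100 per year
```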

Second, this incredibly poor performance in 2000 got worse by 2014. …

What has this got to do with education dashboards? The way large bureaucracies prefer to work is to specify process compliance and inputs and then measure those as a means of driving performance. This logistical mode of managing an organization works best when both process compliance and inputs are easily “observable” in the economist’s sense of easily verifiable, contractible, adjudicated. This leads to attention to processes and inputs that are “thin” in the Clifford Geertz sense (adopted by James Scott as his primary definition of how a “high modern” bureaucracy and hence the state “sees” the world). So in education one would specify easily observable inputs like textbook availability, class size, and school infrastructure. Even if one were talking about “quality” of schooling, a large bureaucracy would want this, too, reduced to “thin” indicators, like the fraction of teachers with a given type of formal degree, or process compliance measures, like whether teachers were hired based on some formal assessment.

Those involved in schooling can then become obsessed with their dashboards and the “thin” progress that is being tracked and easily ignore the loud warning signals saying: Stall!…(More)”.

Mapping the Privacy-Utility Tradeoff in Mobile Phone Data for Development


Paper by Alejandro Noriega-Campero, Alex Rutherford, Oren Lederman, Yves A. de Montjoye, and Alex Pentland: “Today’s age of data holds high potential to enhance the way we pursue and monitor progress in the fields of development and humanitarian action. We study the relation between data utility and privacy risk in large-scale behavioral data, focusing on mobile phone metadata as paradigmatic domain. To measure utility, we survey experts about the value of mobile phone metadata at various spatial and temporal granularity levels. To measure privacy, we propose a formal and intuitive measure of reidentification risk, the information ratio, and compute it at each granularity level. Our results confirm the existence of a stark tradeoff between data utility and reidentifiability, where the most valuable datasets are also most prone to reidentification. When data is specified at ZIP-code and hourly levels, outside knowledge of only 7% of a person’s data suffices for reidentification and retrieval of the remaining 93%. In contrast, in the least valuable dataset, specified at municipality and daily levels, reidentification requires on average outside knowledge of 51%, or 31 data points, of a person’s data to retrieve the remaining 49%. Overall, our findings show that coarsening data directly erodes its value, and highlight the need for using data coarsening not as a stand-alone mechanism, but in combination with data-sharing models that provide adjustable degrees of accountability and security….(More)”.
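The information ratio itself is not defined in the excerpt; as a toy illustration of why granularity drives reidentifiability (this is not the authors' measure, and the population sizes and granularities below are invented), one can simulate how many of a person's location points an attacker must know before that person is unique in the dataset:

```python
import random

random.seed(0)

def make_traces(n_people, n_points, n_regions, n_timebins):
    """Each person's trace is a set of (region, time-bin) observations."""
    return [
        {(random.randrange(n_regions), random.randrange(n_timebins))
         for _ in range(n_points)}
        for _ in range(n_people)
    ]

def points_needed(traces, target):
    """Number of the target's points an attacker must know before the
    target is the only person in the dataset matching all of them."""
    known = []
    for point in sorted(traces[target]):  # sorted for a deterministic order
        known.append(point)
        matches = [i for i, t in enumerate(traces)
                   if all(p in t for p in known)]
        if matches == [target]:
            return len(known)
    return None  # the full trace is still not unique

# Fine granularity (stand-in for ZIP code x hour): many distinct bins.
fine = make_traces(200, 10, n_regions=500, n_timebins=24)
# Coarse granularity (stand-in for municipality x day): few bins.
coarse = make_traces(200, 10, n_regions=20, n_timebins=2)

print("fine:", points_needed(fine, 0))
print("coarse:", points_needed(coarse, 0))
```

The attacker needs more known points at the coarse granularity than at the fine one, mirroring the direction, though not the magnitudes, of the paper's 7% vs 51% contrast.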

A rationale for data governance as an approach to tackle recurrent drawbacks in open data portals


Conference paper by Juan Ribeiro Reis et al: “Citizens and developers are gaining broad access to public data sources, made available in open data portals. These machine-readable datasets enable the creation of applications that help the population in several ways, giving them the opportunity to actively participate in governance processes, such as decision-making and policy-making.

While the number of open data portals grows over the years, researchers have identified recurrent problems with the data they provide, such as a lack of data standards, difficulty of data access, and poor understandability. Such issues hinder the effective use of the data. Several works in the literature propose different approaches to mitigate these issues, based on novel or well-known data management techniques.

However, there is a lack of general frameworks for tackling these problems. Data governance, on the other hand, has been applied in large companies to manage data problems, ensuring that data meets business needs and becomes an organizational asset. In this paper, we first highlight the main drawbacks pointed out in the literature for government open data portals. We then discuss how data governance can tackle many of the issues identified…(More)”.

The economic value of data: discussion paper


HM Treasury (UK): “Technological change has radically increased both the volume of data in the economy, and our ability to process it. This change presents an opportunity to transform our economy and society for the better.

Data-driven innovation holds the keys to addressing some of the most significant challenges confronting modern Britain, whether that is tackling congestion and improving air quality in our cities, developing ground-breaking diagnosis systems to support our NHS, or making our businesses more productive.

The UK’s strengths in cutting-edge research and the intangible economy make it well-placed to be a world leader, and estimates suggest that data-driven technologies will contribute over £60 billion per year to the UK economy by 2020. Recent events have raised public questions and concerns about the way that data, and particularly personal data, can be collected, processed, and shared with third party organisations.

These are concerns that this government takes seriously. The Data Protection Act 2018 updates the UK’s world-leading data protection framework to make it fit for the future, giving individuals strong new rights over how their data is used. Alongside maintaining a secure, trusted data environment, the government has an important role to play in laying the foundations for a flourishing data-driven economy.

This means pursuing policies that improve the flow of data through our economy, and ensure that those companies who want to innovate have appropriate access to high-quality and well-maintained data.

This discussion paper describes the economic opportunity presented by data-driven innovation, and highlights some of the key challenges that government will need to address, such as: providing clarity around ownership and control of data; maintaining a strong, trusted data protection framework; making effective use of public sector data; driving interoperability and standards; and enabling safe, legal and appropriate data sharing.

Over the last few years, the government has taken significant steps to strengthen the UK’s position as a world leader in data-driven innovation, including by agreeing the Artificial Intelligence Sector Deal, establishing the Geospatial Commission, and making substantial investments in digital skills. The government will build on those strong foundations over the coming months, including by commissioning an Expert Panel on Competition in Digital Markets. This Expert Panel will support the government’s wider review of competition law by considering how competition policy can better enable innovation and support consumers in the digital economy.

There are still big questions to be answered. This document marks the beginning of a wider set of conversations that government will be holding over the coming year, as we develop a new National Data Strategy….(More)”.

Regulatory Technology – Replacing Law with Computer Code


LSE Legal Studies Working Paper by Eva Micheler and Anna Whaley: “Recently, both the Bank of England and the Financial Conduct Authority have carried out experiments using new digital technology for regulatory purposes. The idea is to replace rules written in natural legal language with computer code and to use artificial intelligence in regulation.
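The excerpt does not describe the Bank of England or FCA experiments in any detail. Purely as an illustration of the “rules as code” idea, a natural-language regulatory threshold can be restated as an executable check; every name and number below is invented for the sketch, not taken from either regulator:

```python
from dataclasses import dataclass

# Hypothetical rule, in natural legal language:
#   "A firm must hold liquid assets equal to at least 10% of its
#    short-term liabilities, or 5% if it is a small firm."
# The same rule restated as machine-executable code:

@dataclass
class Firm:
    liquid_assets: float
    short_term_liabilities: float
    is_small: bool

def required_ratio(firm: Firm) -> float:
    """Threshold the rule prescribes for this class of firm."""
    return 0.05 if firm.is_small else 0.10

def is_compliant(firm: Firm) -> bool:
    """Apply the encoded rule to one firm's reported figures."""
    if firm.short_term_liabilities == 0:
        return True  # nothing to cover
    ratio = firm.liquid_assets / firm.short_term_liabilities
    return ratio >= required_ratio(firm)

print(is_compliant(Firm(12.0, 100.0, is_small=False)))  # 0.12 >= 0.10 -> True
print(is_compliant(Firm(4.0, 100.0, is_small=True)))    # 0.04 <  0.05 -> False
```

Encoded this way, compliance becomes machine-checkable at scale; the trade-off the article's framing points toward is that code admits none of the interpretive flexibility of natural legal language.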

This new way of designing public law is in line with the government’s vision for the UK to become a global leader in digital technology. It is also reflected in the FCA’s business plan.

The article reviews the technology and the advantages and disadvantages of combining the technology with regulatory law. It then informs the discussion from a broader public law perspective. It analyses regulatory technology through criteria developed in the mainstream regulatory discourse. It contributes to that discourse by anticipating problems that will arise as the technology evolves. In addition, the hope is to assist the government in avoiding mistakes that have occurred in the past and creating a better system from the start…(More)”.

Informational Autocrats


Paper by Sergei M. Guriev and Daniel Treisman: “In recent decades, dictatorships based on mass repression have largely given way to a new model based on the manipulation of information. Instead of terrorizing citizens into submission, “informational autocrats” artificially boost their popularity by convincing the public they are competent.

To do so, they use propaganda and silence informed members of the elite by co-optation or censorship.

Using several sources – including a newly created dataset of authoritarian control techniques – we document a range of trends in recent autocracies that fit the theory: a decline in violence, efforts to conceal state repression, rejection of official ideologies, imitation of democracy, a perceptions gap between masses and elite, and the adoption by leaders of a rhetoric of performance rather than one aimed at inspiring fear….(More)”

Identifying Healthcare Fraud with Open Data


Paper by Xuan Zhang et al: “Healthcare fraud is a serious problem that impacts every patient and consumer. This fraudulent behavior causes excessive financial losses every year and significant patient harm. Healthcare fraud includes health insurance fraud, fraudulent billing of insurers for services not provided, and exaggeration of medical services. Identifying healthcare fraud is thus an urgent task for avoiding the abuse and waste of public funds. Existing methods in this research field usually use classified data from governments, which greatly compromises their generalizability and scope of application. This paper introduces a methodology that uses publicly available data sources to identify potentially fraudulent behavior among physicians. The research involved pairing multiple datasets, selecting useful features, comparing classification models, and analyzing useful predictors. Our performance evaluation results clearly demonstrate the efficacy of the proposed method….(More)”.
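The excerpt names the pipeline stages (data pairing, feature selection, classification) without specifying them. As a minimal stand-in, not the paper's method, even a simple per-patient payment outlier score over open billing records conveys the idea; the records and threshold below are toy values:

```python
import statistics

# Toy open-data-style records: (physician_id, total_payments, n_patients)
records = [
    ("D001", 120_000, 400), ("D002", 95_000, 310), ("D003", 110_000, 380),
    ("D004", 105_000, 350), ("D005", 980_000, 300),  # anomalously high
    ("D006", 88_000, 290), ("D007", 101_000, 340),
]

# Derived feature: payment per patient.
per_patient = {pid: pay / n for pid, pay, n in records}
mean = statistics.mean(per_patient.values())
stdev = statistics.stdev(per_patient.values())

# Flag physicians whose z-score sits far above the norm.
flagged = [pid for pid, v in per_patient.items()
           if (v - mean) / stdev > 2.0]
print(flagged)
# prints: ['D005']
```

A real pipeline would pair several datasets, engineer many such features, and train supervised classifiers where labels exist; the outlier score is only the simplest instance of the approach.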

Open innovation and the evaluation of internet-enabled public services in smart cities


Krassimira Paskaleva and Ian Cooper in Technovation: “This article is focused on public service innovation from an innovation management perspective. It presents research experience gained from a European project for managing social and technological innovation in the production and evaluation of citizen-centred internet-enabled services in the public sector.

It is based on six urban pilot initiatives, which sought to operationalise a new approach to co-producing and co-evaluating civic services in smart cities – commonly referred to as open innovation for smart city services. Research suggests that the evidence base underpinning this approach is not sufficiently robust to support claims being made about its effectiveness.

Instead evaluation research of citizen-centred internet-enabled urban services is in its infancy and there are no tested methods or tools in the literature for supporting this approach.

The paper reports on the development and trialing of a novel Co-evaluation Framework, with indicators and reporting categories, used to support the co-production of smart city services in an EU-funded project. Our point of departure is that innovation of services is a sub-set of innovation management that requires effective integration of technological with social innovation, supported by the right skills and capacities. The main skill sets needed for effective co-evaluation of open innovation services are the integration of stakeholder management and evaluation capacities.”

Big Data: the End of the Scientific Method?


Paper by S. Succi and P.V. Coveney at arXiv: “We argue that the boldest claims of Big Data are in need of revision and toning down, in view of a few basic lessons learned from the science of complex systems. We point out that, once the most extravagant claims of Big Data are properly discarded, a synergistic merging of Big Data with big theory offers considerable potential to spawn a new scientific paradigm capable of overcoming some of the major barriers confronted by the modern scientific method originating with Galileo. These obstacles are due to the presence of nonlinearity, nonlocality and hyperdimensions, which one encounters frequently in multiscale modelling….(More)”.

How to be a public entrepreneur


Rowan Conway at the RSA: “Political theorist Elinor Ostrom coined the phrase “public entrepreneur” in her 1965 UCLA PhD thesis, where she proposed that government actors should be the makers of purpose-driven businesses. She later went on to surprise the world of economics by winning a Nobel prize.

To the economic establishment, Ostrom was a social scientist, and her theories of common goods and public-purpose enterprise ran counter to the economic orthodoxy. Forty-four years later, at the same time that she was taking the stage as the first (and only) woman to win a Nobel prize for economics, another California-based thinker was positing his own vision for entrepreneurship… “Move fast and break things” was famously Mark Zuckerberg’s credo for Silicon Valley entrepreneurs. “Unless you are breaking stuff,” he said in 2009, “you are not moving fast enough.” This phrase came to epitomise the “fail fast” start-up culture that has seeped into our consciousness and redefined modern life in the last decade.

Public vs Private entrepreneurs

So which of these two types of entrepreneurship should prevail? I’d say that they’re not playing on the same field and barely even playing the same game. While the Silicon Valley model glorifies the frat boys who dreamt up tech start-ups in their dorm rooms and took the “self-made” financial gains when big tech took off, public entrepreneurs are not cast from this mold. They are the government actors taking on the system to solve social and environmental problems and the idea of “breaking things” won’t appeal to them. “Moving fast”, however, speaks to their ambitions for an agile government that wants to make change in a digital world.

Public entrepreneurs are socially minded — but they differ from social entrepreneurs in that they carry out a public or state role. In a Centre for Public Impact briefing paper entitled “Enter the Public Entrepreneur” the difference is clear:

“While “social entrepreneurs” are people outside government, public entrepreneurs act within government and, at their heart, are a blend of two different roles: that of a public servant, and that of an entrepreneur. The underlying premise is that these roles are usually distinct but the skill sets they require need not be. Indeed, the future public servant will increasingly need to think and act like an entrepreneur — building new relationships, leveraging resources, working across sector lines and acting, and sometimes failing, fast.”

Today we publish an RSA Lab report entitled “Move Fast and Fix Things” in partnership with Innovate UK. The report examines the role of public entrepreneurs who want to find ways to move fast without leaving a trail of destruction. It builds on the literature that makes the case for public missions and entrepreneurship in government, and acts as a kind of “how to” guide for those in the public sector who want to think and act like entrepreneurs, but sometimes feel like they are pushing up against an immovable bureaucratic system.

Acting entrepreneurially with procurement

A useful distinction between types of government innovation by the European Commission describes “innovation in government” as transforming public administration, such as the shift to digital service provision, and “innovation through government” as initiatives that “foster innovation elsewhere in society, such as the public procurement of innovation”. Our report looks at public procurement — specifically the Small Business Research Initiative (SBRI) — as a route for innovation through government.

Governments have catalytic spending power. The UK public sector alone spends over £251.5 billion annually procuring goods and services, which accounts for 33% of public sector spend and 13.7% of GDP. A profound shift in practice is required if government is to proactively use this power to stimulate innovation in the way that Mariana Mazzucato, author of The Entrepreneurial State, calls for. As Director of the UCL Institute for Innovation and Public Purpose, she advocates for “mission-oriented innovation”, which can enable speed as it has “not only a rate, but also a direction” — purposefully using government’s purchasing power to stimulate innovation for good.

But getting procurement professionals to understand how to be entrepreneurial with public funds is no mean feat….(More)”.