Stefaan Verhulst

New Report by the OECD: “The 2017 volume of the Development Co-operation Report focuses on Data for Development. “Big Data” and “the Internet of Things” are more than buzzwords: the data revolution is transforming the way that economies and societies function across the planet. The Sustainable Development Goals, along with the data revolution, are opportunities that should not be missed: more and better data can help boost inclusive growth, fight inequalities and combat climate change. These data are also essential to measure and monitor progress against the Sustainable Development Goals.

The value of data in enabling development is uncontested. Yet, there continue to be worrying gaps in basic data about people and the planet and weak capacity in developing countries to produce the data that policy makers need to deliver reforms and policies that achieve real, visible and long-lasting development results. At the same time, investing in building statistical capacity – which represented about 0.30% of ODA in 2015 – is not a priority for most providers of development assistance.

There is a need for stronger political leadership, greater investment and more collective action to bridge the data divide for development. With the unfolding data revolution, developing countries and donors have a unique chance to act now to boost data production and use for the benefit of citizens. This report sets out priority actions and good practices that will help policy makers and providers of development assistance to bridge the global data divide, notably by strengthening statistical systems in developing countries to produce better data for better policies and better lives….(More)”

Data for Development

Barnabas Imre Szaszi et al. in the Journal of Behavioral Decision Making: “In this paper, we provide a domain-general scoping review of the nudge movement by reviewing 422 choice architecture interventions in 156 empirical studies. We report the distribution of the studies across countries, years, domains, subdomains of applicability, intervention types, and the moderators associated with each intervention category to review the current state of the nudge movement. Furthermore, we highlight certain characteristics of the studies and experimental and reporting practices which can hinder the accumulation of evidence in the field. Specifically, we found that 74% of the studies were mainly motivated to assess the effectiveness of the interventions in one specific setting, while only 24% of the studies focused on the exploration of moderators or underlying processes. We also observed that only 7% of the studies applied power analysis, 2% used guidelines aiming to improve the quality of reporting, no study in our database was preregistered, and the intervention nomenclatures used were non-exhaustive and often had overlapping categories. Building on our current observations and proposed solutions from other fields, we provide directly applicable recommendations for future research to support evidence accumulation on why and when nudges work….(More)”.

A Systematic Scoping Review of the Choice Architecture Movement: Towards Understanding When and Why Nudges Work
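
To make the statistical-power finding concrete, here is a minimal sketch (ours, not the authors’) of the calculation that the 93% of studies without a power analysis skipped: sizing a two-arm nudge experiment, assuming a small standardized effect of d = 0.2 and conventional error rates.

```python
# A minimal sketch (not from the paper): sample size per arm needed to
# detect an assumed small nudge effect of d = 0.2 in a two-arm experiment.
from statsmodels.stats.power import TTestIndPower

n_per_arm = TTestIndPower().solve_power(
    effect_size=0.2,         # assumed standardized effect (Cohen's d)
    alpha=0.05,              # conventional significance level
    power=0.8,               # conventional target power
    alternative="two-sided",
)
print(f"Participants needed per arm: {n_per_arm:.0f}")  # roughly 394
```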

Mark Thompson at Computer Weekly: “Discussion around reforming public services is as important as better information sharing rules if government is to make the most of public data…

Our public services face two paradoxes in relation to data sharing. First, on the demand side, “Zuckerberg’s law” – which claims that the amount of data we’re happy to share with companies increases exponentially year-on-year – flies in the face of our wariness as citizens to share with the state….

The upcoming General Data Protection Regulation (GDPR) – a beefed-up version of the existing Data Protection Act (DPA) – is likely to only exacerbate a fundamental problem, therefore: citizens don’t want the state to know much about them, and public servants don’t want to share. Each behaviour is paradoxical, and thus complex to address culturally.

Worse, we need to accelerate our public conversation considerably if we are to maintain pace with accelerating technological developments.

Existing complexity in the data space will shortly be exacerbated by new abilities to process unstructured data such as images and natural language – abilities which offer entirely new opportunities for commercial exploitation as well as surveillance…(More)”.

Open data, democracy and public service reform

Paper by David G. Robinson: “Government authorities at all levels increasingly rely on automated predictions, grounded in statistical patterns, to shape people’s lives. Software that wields government power deserves special attention, particularly when it uses historical data to decide automatically what ought to happen next.

In this article, I draw examples primarily from the domain of criminal justice — and in particular, the intersection of civil rights and criminal justice — to illustrate three structural challenges that can arise whenever law or public policy contemplates adopting predictive analytics as a tool:

1) What matters versus what the data measure;
2) Current goals versus historical patterns; and
3) Public authority versus private expertise.

After explaining each of these challenges and illustrating each with concrete examples, I describe feasible ways to avoid these problems and to do prediction more successfully…(More)”

The Challenges of Prediction: Lessons from Criminal Justice
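
Robinson’s first challenge (what matters versus what the data measure) lends itself to a toy simulation. In the sketch below (our illustration, not the paper’s), two groups offend at identical rates, but one group’s offenses are recorded as arrests twice as often; any predictor fit to the arrest records learns the enforcement disparity rather than the behavior.

```python
# Toy illustration (not from the article): the data measure arrests, not
# offending, so measured "risk" reflects enforcement, not behavior.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
group = rng.integers(0, 2, n)              # two groups, labeled 0 and 1
offense = rng.random(n) < 0.10             # identical true offense rate: 10%
# Hypothetical enforcement disparity: group 1's offenses are recorded
# twice as often as group 0's.
arrest_prob = np.where(group == 1, 0.60, 0.30)
arrested = offense & (rng.random(n) < arrest_prob)

for g in (0, 1):
    mask = group == g
    print(f"group {g}: true offense rate {offense[mask].mean():.3f}, "
          f"arrest-based 'risk' {arrested[mask].mean():.3f}")
# Both groups offend at ~0.10, yet arrest-based 'risk' is ~0.03 vs ~0.06,
# so a model trained on arrests would score group 1 as twice as risky.
```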

New paper by Praneetha Vissapragada and Naomi Joswiak: “The Open Government Costing initiative, seeded with funding from the World Bank, was undertaken to develop a practical and actionable approach to pinpointing the full economic costs of various open government programs. The methodology developed through this initiative represents an important step towards conducting more sophisticated cost-benefit analyses – and ultimately understanding the true value – of open government reforms intended to increase citizen engagement, promote transparency and accountability, and combat corruption, insights that have been sorely lacking in the open government community to date. The Open Government Costing Framework and Methods section (Section 2 of this report) outlines the critical components needed to conduct cost analysis of open government programs, with the ultimate objective of putting a price tag on key open government reform programs in various countries at a particular point in time. This framework introduces a costing process that employs six essential steps for conducting a cost study, including (1) defining the scope of the program, (2) identifying types of costs to assess, (3) developing a framework for costing, (4) identifying key components, (5) conducting data collection and (6) conducting data analysis. While the costing methods are built on related approaches used for analysis in other sectors such as health and nutrition, this framework and methodology was specifically adapted for open government programs and thus addresses the unique challenges associated with these types of initiatives. Using the methods outlined in this document, we conducted a cost analysis of two case studies: (1) ProZorro, an e-procurement program in Ukraine; and (2) Sierra Leone’s Open Data Program….(More)”

Priceless? A new framework for estimating the cost of open government reforms
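
As a rough sketch of how the framework’s data-collection and analysis steps (5 and 6) might be operationalized, the toy code below represents cost line items and totals them by category. The categories and figures are hypothetical, not the report’s.

```python
# Minimal sketch of cost aggregation; categories and amounts are made up.
from dataclasses import dataclass

@dataclass
class CostItem:
    component: str      # a key program component (framework step 4)
    category: str       # a type of cost identified in step 2
    amount_usd: float   # a figure gathered during data collection (step 5)

def totals_by_category(items: list[CostItem]) -> dict[str, float]:
    """Aggregate collected costs by category (part of step 6)."""
    totals: dict[str, float] = {}
    for item in items:
        totals[item.category] = totals.get(item.category, 0.0) + item.amount_usd
    return totals

# Hypothetical line items for an e-procurement program:
items = [
    CostItem("platform development", "technology", 250_000),
    CostItem("staff training", "personnel", 40_000),
    CostItem("public outreach", "communications", 15_000),
]
print(totals_by_category(items))
```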

Oliver Roeder at FiveThirtyEight: “The Supreme Court does not compute. Or at least some of its members would rather not. The justices, the most powerful jurists in the land, seem to have a reluctance — even an allergy — to taking math and statistics seriously.

For decades, the court has struggled with quantitative evidence of all kinds in a wide variety of cases. Sometimes justices ignore this evidence. Sometimes they misinterpret it. And sometimes they cast it aside in order to hold on to more traditional legal arguments. (And, yes, sometimes they also listen to the numbers.) Yet the world itself is becoming more computationally driven, and some of those computations will need to be adjudicated before long. Some major artificial intelligence case will likely come across the court’s desk in the next decade, for example. By voicing an unwillingness to engage with data-driven empiricism, justices — and thus the court — are at risk of making decisions without fully grappling with the evidence.

This problem was on full display earlier this month, when the Supreme Court heard arguments in Gill v. Whitford, a case that will determine the future of partisan gerrymandering — and the contours of American democracy along with it. As my colleague Galen Druke has reported, the case hinges on math: Is there a way to measure a map’s partisan bias and to create a standard for when a gerrymandered map infringes on voters’ rights?…(More)”.

The Supreme Court Is Allergic To Math
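
One proposed measure at the center of Gill v. Whitford is the “efficiency gap”: the difference between the two parties’ wasted votes, divided by all votes cast. A minimal sketch with made-up district counts:

```python
# Efficiency gap: (party A's wasted votes - party B's) / total votes.
# A wasted vote is any vote for a loser, or a winner's vote beyond the
# number needed to win. The district tallies below are made up.
def efficiency_gap(districts: list[tuple[int, int]]) -> float:
    """districts: (votes_a, votes_b) per district. A positive result means
    the map wastes more of party A's votes, i.e., it favors party B."""
    wasted_a = wasted_b = total = 0
    for votes_a, votes_b in districts:
        district_total = votes_a + votes_b
        needed_to_win = district_total // 2 + 1
        if votes_a > votes_b:
            wasted_a += votes_a - needed_to_win  # winner's surplus votes
            wasted_b += votes_b                  # all losing votes
        else:
            wasted_b += votes_b - needed_to_win
            wasted_a += votes_a
        total += district_total
    return (wasted_a - wasted_b) / total

# Five hypothetical 100-vote districts:
print(efficiency_gap([(70, 30), (70, 30), (40, 60), (45, 55), (45, 55)]))
# 0.182: party A wins 54% of the votes but wastes far more of them.
```
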
Essay by Joseph E. Stiglitz, Dean Baker and Arjun Jayadev: “Developing countries are increasingly pushing back against the intellectual property regime foisted on them by the advanced economies over the last 30 years. They are right to do so, because what matters is not only the production of knowledge, but also that it is used in ways that put the health and wellbeing of people ahead of corporate profits….When the South African government attempted to amend its laws in 1997 to avail itself of affordable generic medicines for the treatment of HIV/AIDS, the full legal might of the global pharmaceutical industry bore down on the country, delaying implementation and extracting a high human cost. South Africa eventually won its case, but the government learned its lesson: it did not try again to put its citizens’ health and wellbeing into its own hands by challenging the conventional global intellectual property (IP) regime….(More)”.

Intellectual Property for the Twenty-First-Century Economy

Manuel Pedro Rodríguez Bolívar and Laura Alcaide Muñoz in the International Journal of Public Administration in the Digital Age: “The growing participation in social networking sites is altering the nature of social relations and changing the nature of political and public dialogue. This paper aims to contribute to the current debate on Web 2.0 technologies and their implications for local governance, through the identification of the perceptions of policy makers in local governments on the use of Web 2.0 in providing public services (reasons, advantages and risks) and on the change of the roles that these technologies could provoke in interactions between local governments and their stakeholders (governance models). This paper also analyzes whether municipal size is a main factor that could influence policy makers’ perceptions regarding these main topics. Findings suggest that policy makers are willing to implement Web 2.0 technologies in providing public services, but preferably under the Bureaucratic model framework, thus retaining a leading role in this implementation. Municipal size is a factor that could influence policy makers’ perceptions….(More)”.

Political Ideology and Municipal Size as Incentives for the Implementation and Governance Models of Web 2.0 in Providing Public Services

Paper by Sara Makki et al.: “Fraudulent activities (e.g., suspicious credit card transactions, financial reporting fraud, and money laundering) are critical concerns to various entities including banks, insurance companies, and public service organizations. Typically, these activities lead to detrimental effects on the victims, such as financial loss. Over the years, fraud analysis techniques have undergone rigorous development. Lately, however, the advent of Big Data has led to vigorous advancement of these techniques, since Big Data has created extensive opportunities to combat financial fraud. Given the massive amount of data that investigators need to sift through, integrating massive volumes of data from multiple heterogeneous sources (e.g., social media, blogs) to find fraudulent patterns is emerging as a feasible approach….(More)”.

Fraud Data Analytics Tools and Techniques in Big Data Era
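
One family of techniques surveyed in this space is unsupervised anomaly detection over transaction features. The sketch below (our illustration with hypothetical features, not a method from the paper) trains on normal transactions and flags one that deviates sharply:

```python
# Minimal anomaly-detection sketch; features and data are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Feature columns: [amount_usd, hour_of_day, merchant_risk_score]
normal = np.column_stack([
    rng.lognormal(3.5, 0.5, 5000),   # typical purchase amounts (median ~$33)
    rng.normal(14, 4, 5000) % 24,    # activity centered in the daytime
    rng.uniform(0.0, 0.3, 5000),     # low-risk merchants
])
suspicious = np.array([[9500.0, 3.0, 0.9]])  # large amount, 3am, risky merchant

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(model.predict(suspicious))  # [-1] marks the transaction as anomalous
```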

Dom Galeon in Futurism: “As artificial intelligence (AI) development progresses, experts have begun considering how best to give an AI system an ethical or moral backbone. A popular idea is to teach AI to behave ethically by learning from decisions made by the average person.

To test this assumption, researchers from MIT created the Moral Machine. Visitors to the website were asked to make choices regarding what an autonomous vehicle should do when faced with rather gruesome scenarios. For example, if a driverless car was being forced toward pedestrians, should it run over three adults to spare two children? Save a pregnant woman at the expense of an elderly man?

The Moral Machine was able to collect a huge swath of this data from random people, so Ariel Procaccia from Carnegie Mellon University’s computer science department decided to put that data to work.

In a new study published online, he and Iyad Rahwan — one of the researchers behind the Moral Machine — taught an AI using the Moral Machine’s dataset. Then, they asked the system to predict how humans would want a self-driving car to react in similar but previously untested scenarios….

This idea of having to choose between two morally problematic outcomes isn’t new. Ethicists even have a name for it: the double-effect. However, having to apply the concept to an artificially intelligent system is something humankind has never had to do before, and numerous experts have shared their opinions on how best to go about it.

OpenAI co-chairman Elon Musk believes that creating an ethical AI is a matter of coming up with clear guidelines or policies to govern development, and governments and institutions are slowly heeding Musk’s call. Germany, for example, crafted the world’s first ethical guidelines for self-driving cars. Meanwhile, Google parent company Alphabet’s AI DeepMind now has an ethics and society unit.

Other experts, including a team of researchers from Duke University, think that the best way to move forward is to create a “general framework” that describes how AI will make ethical decisions….(More)”.

Crowdsourced Morality Could Determine the Ethics of Artificial Intelligence
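
The aggregation idea can be caricatured in a few lines: gather many judgments on a dilemma and let the majority choice stand in for the crowd’s ethics. The sketch below uses made-up responses; the actual study learns and aggregates per-respondent preference models, which is considerably more involved.

```python
# Heavily simplified sketch of crowd-aggregated ethics; data are made up.
from collections import Counter

# Each record pairs a dilemma with the option one respondent chose.
responses = [
    ("three_adults_vs_two_children", "spare_children"),
    ("three_adults_vs_two_children", "spare_children"),
    ("three_adults_vs_two_children", "spare_adults"),
    ("pregnant_woman_vs_elderly_man", "spare_woman"),
    ("pregnant_woman_vs_elderly_man", "spare_woman"),
]

def crowd_choice(dilemma: str) -> str:
    """Return the majority choice recorded for a given dilemma."""
    votes = Counter(choice for d, choice in responses if d == dilemma)
    return votes.most_common(1)[0][0]

print(crowd_choice("three_adults_vs_two_children"))  # spare_children
```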
