‘Frontier methods’ offer a powerful but accessible approach for measuring the efficiency of public sector organisations


EUROPP Blog of the LSE: “How can the efficiency of public sector organisations best be measured? Jesse Stroobants and Geert Bouckaert write that while the efficiency of an organisation is typically measured using performance indicators, there are some notable problems with this approach, such as the tendency for different indicators to produce conflicting conclusions on organisational performance. As an alternative, they outline so-called ‘frontier methods’, which use direct comparisons between different organisations to create a benchmark or standard for performance. They argue that the frontier approach not only alleviates some of the problems associated with performance indicators, but is also broadly accessible for those employed in public administration….
However, despite their merits, there are some drawbacks to using performance indicators. First, they provide only an indirect or partial indication of performance; with respect to efficiency, for instance, indicators are typically single-input/single-output measures. Second, they may provide conflicting results: an organisation that appears to do well on one indicator may perform less successfully when considered using another.
In this context, ‘frontier methods’ offer alternative techniques for measuring and evaluating the performance of a group of comparable entities. Unlike single-factor measures that reflect only partial aspects of performance, frontier techniques can be applied to assess overall performance by handling multiple inputs and outputs at the same time. Specifically, Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH) have proven to be useful tools for assessing the relative efficiency of entities….
At this point you may be thinking that the term ‘frontier methods’ sounds overly complex or that these techniques are only likely to be of any use to academic specialists. Yet there are a number of reasons why this interpretation would be incorrect. It is indeed true that DEA and FDH have been used predominantly by economists and econometricians, and only rarely by those employed in public administration. We should re-establish this bridge. Therefore, in a recent article, we have provided a step-by-step application of DEA/FDH to benchmark the efficiency of comparable public sector organisations (in the article’s case: public libraries in Flanders). With this gradual approach, we want to offer both academics and practitioners a basic grounding in more advanced efficiency measurement techniques….(More)”.
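As an editorial aside, the sketch below shows what the standard input-oriented DEA model looks like in practice. The organisations, inputs, outputs, and figures are invented for illustration; they are not taken from the article or from the Flemish library data, and the code is a minimal sketch rather than a full benchmarking exercise.

```python
# A minimal sketch of the input-oriented DEA (CCR) model, solved as a linear
# programme with SciPy. All numbers are hypothetical, purely for illustration.
import numpy as np
from scipy.optimize import linprog

# Rows = comparable organisations (decision-making units).
X = np.array([[8.0, 120.0],     # inputs, e.g. staff FTE and budget (hypothetical)
              [6.0,  90.0],
              [10.0, 150.0]])
Y = np.array([[40.0, 300.0],    # outputs, e.g. visits and loans (hypothetical)
              [35.0, 280.0],
              [42.0, 310.0]])

def dea_efficiency(o, X, Y):
    """Minimise theta such that some weighted combination of all units uses at
    most theta times unit o's inputs while producing at least unit o's outputs."""
    n, m = X.shape                     # n units, m inputs
    s = Y.shape[1]                     # s outputs
    c = np.r_[1.0, np.zeros(n)]        # objective: minimise theta
    A_in = np.c_[-X[o], X.T]           # sum_j lambda_j * x_ij - theta * x_io <= 0
    A_out = np.c_[np.zeros(s), -Y.T]   # -sum_j lambda_j * y_rj <= -y_ro
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]                    # efficiency score of unit o

for o in range(X.shape[0]):
    print(f"Unit {o}: efficiency = {dea_efficiency(o, X, Y):.3f}")
```

Each unit receives a score between 0 and 1; a score of 1 means no weighted combination of the other units can produce at least the same outputs with proportionally fewer inputs, which is the sense in which DEA constructs a best-practice frontier.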

Mastering ’Metrics: The Path from Cause to Effect


Book by Joshua D. Angrist & Jörn-Steffen Pischke: “Applied econometrics, known to aficionados as ’metrics, is the original data science. ’Metrics encompasses the statistical methods economists use to untangle cause and effect in human affairs. Through accessible discussion and with a dose of kung fu–themed humor, Mastering ’Metrics presents the essential tools of econometric research and demonstrates why econometrics is exciting and useful.
The five most valuable econometric methods, or what the authors call the Furious Five (random assignment, regression, instrumental variables, regression discontinuity designs, and differences-in-differences), are illustrated through well-crafted real-world examples (vetted for awesomeness by Kung Fu Panda’s Jade Palace). Does health insurance make you healthier? Randomized experiments provide answers. Are expensive private colleges and selective public high schools better than more pedestrian institutions? Regression analysis and a regression discontinuity design reveal the surprising truth. When private banks teeter, and depositors take their money and run, should central banks step in to save them? Differences-in-differences analysis of a Depression-era banking crisis offers a response. Could arresting O. J. Simpson have saved his ex-wife’s life? Instrumental variables methods instruct law enforcement authorities in how best to respond to domestic abuse….(More).”
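To make one of the Furious Five concrete, here is a toy differences-in-differences regression. The data are simulated with a known treatment effect; this is not Angrist and Pischke’s banking example, only a minimal sketch of the design it illustrates.

```python
# A toy differences-in-differences regression. The data are simulated with a
# known effect of 2.0; this is only a sketch of the design, not the book's
# Depression-era banking analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # 1 = unit in the treated group
    "post": rng.integers(0, 2, n),      # 1 = observation after the policy change
})
# Outcome = group difference + common time trend + treatment effect + noise
df["y"] = (0.5 * df["treated"] + 1.0 * df["post"]
           + 2.0 * df["treated"] * df["post"]
           + rng.normal(0, 1, n))

# The coefficient on treated:post is the differences-in-differences estimate.
model = smf.ols("y ~ treated + post + treated:post", data=df).fit()
print(model.params)
```

Under the parallel-trends assumption, the coefficient on the treated:post interaction recovers the treatment effect, which is 2.0 by construction in this simulation.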

Wikipedia and the Politics of Openness


New book by Nathaniel Tkacz: “Few virtues are as celebrated in contemporary culture as openness. Rooted in software culture and carrying more than a whiff of Silicon Valley technical utopianism, openness—of decision-making, data, and organizational structure—is seen as the cure for many problems in politics and business.
But what does openness mean, and what would a political theory of openness look like? With Wikipedia and the Politics of Openness, Nathaniel Tkacz uses Wikipedia, the most prominent product of open organization, to analyze the theory and politics of openness in practice—and to break its spell. Through discussions of edit wars, article deletion policies, user access levels, and more, Tkacz enables us to see how the key concepts of openness—including collaboration, ad-hocracy, and the splitting of contested projects through “forking”—play out in reality.
The resulting book is the richest critical analysis of openness to date, one that roots media theory in messy reality and thereby helps us move beyond the vaporware promises of digital utopians and take the first steps toward truly understanding what openness does, and does not, have to offer….(More).”

The 18F Hub


18F Blog: “Clear, organized, and easy access to information is critical to supporting team growth and promoting the kind of culture change that 18F, the U.S. Digital Service, and fellow Innovators and Early Adopters aim to produce throughout federal IT development. As a small step towards that goal, we’d like to announce the 18F Hub, a Jekyll-based documentation platform that aims to help development teams organize and easily share their information, and to enable easy exploration of the connections between team members, projects, and skill sets. It’s still very much alpha-stage software, but over time, we’ll incrementally improve it until it serves as the go-to place for all our team’s working information, whether that information is integrated into the Hub directly or provided as links to other sources. It also serves as a lightweight tool that other teams can experiment with and deploy with a minimum of setup.

While we at 18F strongly believe in the value of transparency and collaboration across government, the details of our team, our projects, and our activities aren’t the real reason we’re launching the Hub: the exposure of our domain knowledge, working models and processes through tangible artifacts that people can adapt to their own environments is… (More).

Governments and Citizens Getting to Know Each Other? Open, Closed, and Big Data in Public Management Reform


New paper by Amanda Clarke and Helen Margetts in Policy and Internet: “Citizens and governments live increasingly digital lives, leaving trails of digital data that have the potential to support unprecedented levels of mutual government–citizen understanding, and in turn, vast improvements to public policies and services. Open data and open government initiatives promise to “open up” government operations to citizens. New forms of “big data” analysis can be used by government itself to understand citizens’ behavior and reveal the strengths and weaknesses of policy and service delivery. In practice, however, open data emerges as a reform development directed to a range of goals, including the stimulation of economic development, and not strictly transparency or public service improvement. Meanwhile, governments have been slow to capitalize on the potential of big data, while the largest datasets they do collect remain “closed” and under-exploited within the confines of intelligence agencies. Drawing on interviews with civil servants and researchers in Canada, the United Kingdom, and the United States between 2011 and 2014, this article argues that a big data approach could offer the greatest potential as a vehicle for improving mutual government–citizen understanding, thus embodying the core tenets of Digital Era Governance, argued by some authors to be the most viable public management model for the digital age (Dunleavy, Margetts, Bastow, & Tinkler, 2005, 2006; Margetts & Dunleavy, 2013).”
 

Governing the Embedded State: The Organizational Dimension of Governance


Book by Bengt Jacobsson, Jon Pierre, and Göran Sundström: “Governing the Embedded State integrates governance theory with organization theory and examines how states address social complexity and international embeddedness. Drawing upon extensive empirical research on the Swedish government system, this volume describes a strategy of governance based in a metagovernance model of steering by designing institutional structures. This strategy is supplemented by micro-steering of administrative structures within the path dependencies put in place through metagovernance. Both of these strategies of steering rely on subtle methods of providing political guidance to the public service, where norms of loyalty to the government characterize the relationship between politicians and civil servants.

By drawing upon this research, the volume will explain how recent developments such as globalization, Europeanization, the expansion of managerial ideas, and the fragmentation of states have influenced the state’s capacity to govern.
The result is an account of contemporary governance that shows not only the societal constraints on government but also the significance of close interaction and cooperation between the political leadership and the senior civil servants in addressing those constraints.”

People around you control your mind: The latest evidence


In the Washington Post: “…That’s the power of peer pressure. In a recent working paper, Pedro Gardete looked at 65,525 transactions across 1,966 flights and more than 257,000 passengers. He parsed the data into thousands of mini-experiments such as this:

If someone beside you ordered a snack or a film, Gardete was able to see whether you later did, too. In this natural experiment, the person sitting directly in front of you was the control subject. Purchases were made on a touchscreen, so that person wouldn’t have been able to see anything. If you bought something and the person in front of you didn’t, peer pressure may have been the reason.
Because he had reservation data, Gardete could exclude people flying together, and he controlled for all kinds of other factors such as seat choice. This is purely the effect of a stranger’s choice — not just that, but a stranger whom you might be resenting because he is sitting next to you, and this is a plane.
By adding up thousands of these little experiments, Gardete, an assistant professor of marketing at Stanford, came up with an estimate. On average, people bought stuff 15 to 16 percent of the time. But if you saw someone next to you order something, your chances of buying something, too, jumped by 30 percent, or about four percentage points…
The beauty of this paper is that it looks at social influences in a controlled situation. (What’s more of a trap than an airplane seat?) These natural experiments are hard to come by.
Economists and social scientists have long wondered about the power of peer pressure, but it’s one of the trickiest research problems….(More)”.
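Since relative and absolute changes are easy to conflate, here is a quick back-of-the-envelope check of the figures quoted above, assuming a 15.5 percent midpoint baseline (the article gives only “15 to 16 percent”). A 30 percent relative jump works out to roughly the “about four percentage points” reported, putting the implied purchase rate near 20 percent.

```python
# Back-of-the-envelope check: a 30% relative jump on a ~15.5% baseline.
baseline = 0.155            # assumed midpoint of "15 to 16 percent"
relative_lift = 0.30        # "jumped by 30 percent"
absolute_lift = baseline * relative_lift
print(f"absolute lift: {absolute_lift * 100:.1f} percentage points")   # ~4.7
print(f"implied rate:  {(baseline + absolute_lift) * 100:.1f} percent")  # ~20
```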

The problem with the data revolution in four Venn diagrams


Morten Jerven in The Guardian: “In August, UN secretary-general Ban Ki-moon named his independent expert advisory group, 24 experts tasked with providing recommendations on how best to use data to deliver the sustainable development goals…. On 6 November, those recommendations were published in a report entitled A World That Counts, a cleverly crafted motivational manifesto, but by no means a practical roadmap on how to apply a “data revolution” to the future development agenda.
I have previously written about this in more detail, but essentially, the report’s key weakness is that it conflates several terms, and assumes automatic relationships between things such as “counting” and “knowing”.
Using four Venn diagrams, I’ve tried to illustrate some of the main misconceptions.

Not everything that counts can be counted

[Venn diagram 1]

The report strongly suggests that everything that matters can be counted. We know that this is not true. If the guiding principle for the sustainable development goals is to make decisions as if everything can be counted, the end result will be very misleading.

Data is not the same as statistics

[Venn diagram 2]

The “data revolution” hype is just one of many places where the difference between statistics and data is misunderstood. Data is not the same as numbers. Data literally mean ‘what is given’, so when we speak of data we are talking about observations – quantitative or qualitative, or even figurative – that can be used to get information.
To keep talking about data when we mean statistics may sound better, but it only leads to confusion. The report (pdf) calls on the UN to establish “a process whereby key stakeholders create a Global Consensus on Data”. What is that supposed to mean? That statement is meaningless if you replace the word “data” with “observations”, “knowledge”, or “evidence”. It can, however, make sense if you talk about “statistics”.
International organisations do have a natural role when it comes to developing global standards for official statistics. Reaching a global consensus on how observations and evidence constitute knowledge is futile.

More data does not mean better decisions…

There are other ways of knowing than through counting…(More)”

 

Uncle Sam Wants You…To Crowdsource Science


At Co-Labs: “It’s not just for the private sector anymore: Government scientists are embracing crowdsourcing. At a White House-sponsored workshop in late November, representatives from more than 20 different federal agencies gathered to figure out how to integrate crowdsourcing and citizen scientists into various government efforts. The workshop is part of a bigger effort with a lofty goal: Building a set of best practices for the thousands of citizens who are helping federal agencies gather data, from the Environmental Protection Agency (EPA) to NASA….Perhaps the best known federal government crowdsourcing project is Nature’s Notebook, a collaboration between the U.S. Geological Survey and the National Park Service that asks ordinary citizens to take notes on plant and animal species during different times of year. These notes are then cleansed and collated into a massive database on animal and plant phenology that’s used for decision-making by national and local governments. The bulk of the observations, recorded through smartphone apps, are made by ordinary people who spend a lot of time outdoors….Dozens of government agencies are now asking the public for help. The Centers for Disease Control and Prevention runs a student-oriented, Mechanical Turk-style “micro-volunteering” service called CDCology, the VA crowdsources the design of apps for homeless veterans, and the National Weather Service distributes a mobile app called mPING that asks ordinary citizens to help fine-tune public weather reports by giving information on local conditions. The Federal Communications Commission’s Measuring Broadband America app, meanwhile, allows citizens to volunteer information on their Internet broadband speeds, and the Environmental Protection Agency’s Air Sensor Toolbox asks users to track local air pollution….
As of now, however, when it comes to crowdsourcing data for government scientific research, there’s no unified set of standards or best practices. This can lead to wild variations in how various agencies collect data and use it. For officials hoping to implement citizen science projects within government, the roadblocks to crowdsourcing include factors that crowdsourcing is intended to avoid: limited budgets, heavy bureaucracy, and superiors who are skeptical about the value of relying on the crowd for data.
Benforado and Shanley also pointed out that government agencies are subject to additional regulations, such as the Paperwork Reduction Act, which can make implementation of crowdsourcing projects more challenging than they would be in academia or the private sector… (More)”

Sowing the seed: Incentives and Motivations for Sharing Research Data, a researcher's perspective


Knowledge Exchange: “This qualitative study, commissioned by Knowledge Exchange, has gathered evidence, examples and opinions on current and future incentives for research data sharing from the researchers’ point of view, in order to provide recommendations for policy and practice development on how best to incentivize data access and re-use.
The incentives and motivations identified call for the development of a data infrastructure with rich context, in which research data, papers, and other outputs or resources are jointly available within a single resource. Differences between types of data sharing and between research disciplines also need to be acknowledged.
This study helps stakeholders to understand and act.
You can download the full study in PDF format.”