A solution to the single-question crowd wisdom problem


Dražen Prelec, H. Sebastian Seung & John McCoy in Nature: “Once considered provocative, the notion that the wisdom of the crowd is superior to any individual has become itself a piece of crowd wisdom, leading to speculation that online voting may soon put credentialed experts out of business. Recent applications include political and economic forecasting, evaluating nuclear safety, public policy, the quality of chemical probes, and possible responses to a restless volcano. Algorithms for extracting wisdom from the crowd are typically based on a democratic voting procedure. They are simple to apply and preserve the independence of personal judgment. However, democratic methods have serious limitations. They are biased for shallow, lowest common denominator information, at the expense of novel or specialized knowledge that is not widely shared. Adjustments based on measuring confidence do not solve this problem reliably. Here we propose the following alternative to a democratic vote: select the answer that is more popular than people predict. We show that this principle yields the best answer under reasonable assumptions about voter behaviour, while the standard ‘most popular’ or ‘most confident’ principles fail under exactly those same assumptions. Like traditional voting, the principle accepts unique problems, such as panel decisions about scientific or artistic merit, and legal or historical disputes. The potential application domain is thus broader than that covered by machine learning and psychometric methods, which require data across multiple questions…(More)”.
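The selection rule itself is easy to prototype. Below is a minimal sketch, assuming each respondent supplies both a vote and a predicted distribution of the crowd’s votes; the function name and example data are illustrative, not the authors’ code:

```python
from collections import defaultdict

def surprisingly_popular(votes, predictions):
    """Return the answer that is more popular than people predicted.

    votes       -- each respondent's own answer, e.g. ["yes", "no", ...]
    predictions -- per respondent, a dict mapping each answer to the
                   fraction of the crowd they expect to choose it
    """
    actual = defaultdict(float)
    for v in votes:
        actual[v] += 1.0 / len(votes)

    predicted = defaultdict(float)
    for p in predictions:
        for answer, share in p.items():
            predicted[answer] += share / len(predictions)

    # Pick the answer whose actual share most exceeds its predicted share.
    return max(actual, key=lambda a: actual[a] - predicted[a])

# Toy example in the spirit of the paper: a majority votes "yes", but
# nearly everyone also predicts that "yes" will dominate, so the minority
# answer "no" is the surprisingly popular choice.
votes = ["yes"] * 65 + ["no"] * 35
predictions = [{"yes": 0.8, "no": 0.2}] * 100
print(surprisingly_popular(votes, predictions))  # -> "no"
```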

In Beta: Is policymaking stuck in the 19th century?


Global Partners Digital: “Today we’re launching a new series of podcasts – titled In beta – with the aim of critically examining the big questions facing human rights in the digital environment.

The series will be hosted by GPD’s executive director, Charles Bradley, who will interview a different guest – or guests – for each episode.

But before we go into details, a little more on the concept. We’ve created In beta because we felt that there weren’t enough forums for genuine debate and discussion within the digital rights community. We felt that we needed a space where we could host interesting conversations with interesting people in our field, outside of the conventions of traditional policy discourse, which can sometimes work to confine people in silos and discourage more open, experimental thinking.

The series is called In beta because these conversations will be speculative, not definitive. The questions we examine won’t be easy – or even possible – to answer. They may sometimes be provocative. They may themselves raise new questions, and perhaps lay the groundwork for future work.

In the first episode, we talk to the co-founder of GovLab, Stefaan Verhulst, asking – ‘Is policymaking stuck in the 19th century?’…(More)”

The Paradox of Community Power: Cultural Processes and Elite Authority in Participatory Governance


Jeremy R. Levine in Social Forces: “From town halls to public forums, disadvantaged neighborhoods appear more “participatory” than ever. Yet increased participation has not necessarily resulted in increased influence. This article, drawing on a four-year ethnographic study of redevelopment politics in Boston, presents an explanation for the decoupling of participation from the promise of democratic decision-making. I find that poor urban residents gain the appearance of power and status by invoking and policing membership in “the community”—a boundary sometimes, though not always, implicitly defined by race. But this appearance of power is largely an illusion. In public meetings, government officials can reinforce their authority and disempower residents by exploiting the fact that the boundary demarcating “the community” lacks a standardized definition. When officials laud “the community” as an abstract ideal rather than a specific group of people, they reduce “the community process” to a bureaucratic procedure. Residents appear empowered, while officials retain ultimate decision-making authority. I use the tools of cultural sociology to make sense of these findings and conclude with implications for the study of participatory governance and urban inequality….(More)”.

Data in public health


Jeremy Berg in Science: “In 1854, physician John Snow helped curtail a cholera outbreak in a London neighborhood by mapping cases and identifying a central public water pump as the potential source. This event is considered by many to represent the founding of modern epidemiology. Data and analysis play an increasingly important role in public health today. This can be illustrated by examining the rise in the prevalence of autism spectrum disorders (ASDs), where data from varied sources highlight potential factors while ruling out others, such as childhood vaccines, facilitating wise policy choices…. A collaboration between the research community, a patient advocacy group, and a technology company (www.mss.ng) seeks to sequence the genomes of 10,000 well-phenotyped individuals from families affected by ASD, making the data freely available to researchers. Studies to date have confirmed that the genetics of autism are extremely complicated—a small number of genomic variations are closely associated with ASD, but many other variations have much lower predictive power. More than half of siblings, each of whom has ASD, have different ASD-associated variations. Future studies, facilitated by an open data approach, will no doubt help advance our understanding of this complex disorder….

A new data collection strategy was reported in 2013 to examine contagious diseases across the United States, including the impact of vaccines. Researchers digitized all available city and state notifiable disease data from 1888 to 2011, mostly from hard-copy sources. Information corresponding to nearly 88 million cases has been stored in a database that is open to interested parties without restriction (www.tycho.pitt.edu). Analyses of these data revealed that vaccine development and systematic vaccination programs have led to dramatic reductions in the number of cases. Overall, it is estimated that ∼100 million cases of serious childhood diseases have been prevented through these vaccination programs.

These examples illustrate how data collection and sharing through publication and other innovative means can drive research progress on major public health challenges. Such evidence, particularly on large populations, can help researchers and policy-makers move beyond anecdotes—which can be personally compelling, but often misleading—for the good of individuals and society….(More)”

Chile’s ‘Uber of Recycling’ Is Sparking a Recycling Revolution


Tomas Urbina at Motherboard: “In 2015, after finishing a soccer game in Chile’s capital, Santiago, engineering student Cristián Lara and his friends noticed an older man picking through a dumpster nearby. He was searching for anything that could be recycled, and loading it onto his bike.

“It looked like incredibly hard work,” Lara recalled. After talking to the man, Lara learned that he had been doing the same work for 10 years, and was still living in poverty.

The encounter gave Lara an idea. What if there was a way to connect the collector on the street directly to the massive waste streams that exist in Chile, and to the companies that pay decent money for recyclables?

“We knew we had to do something,” said 24-year-old Lara. That’s how a recycling app startup, called ReciclApp, was born. The app launched last August. Since then, the bearded young entrepreneur has been on a mission. Standing in their section of an open collaborative workspace on the fifth floor of the luminous new innovation centre at Santiago’s Catholic University, Lara let his glee shine through in his elevator pitch for the app.

“It’s the Uber of recycling,” he said.

It works like this: individuals, businesses, and institutions download the free app. Once they have cans, boxes or bottles to get rid of, they declare specific numbers in the app and choose a date and time period for pickup. From that data, the company creates and prints out routes for the collectors they work with. There are now an average of 200 collectors working with ReciclApp across Chile, and about 1,000 app users in the country.
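A rough sketch of the workflow just described, in code: users declare what they have and when it can be collected, and the service groups those declarations into pickup routes. All field names and the grouping rule are invented for illustration; ReciclApp’s actual system is not public.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class PickupRequest:
    user_id: str
    cans: int
    boxes: int
    bottles: int
    date: str      # requested pickup date, e.g. "2017-03-01"
    window: str    # requested time window, e.g. "09:00-12:00"
    zone: str      # neighbourhood used for routing

def build_routes(requests):
    """Group declared pickups by zone, date and time window, so each
    collector receives one printable route with guaranteed material."""
    routes = defaultdict(list)
    for r in requests:
        routes[(r.zone, r.date, r.window)].append(r)
    return routes

route_sheets = build_routes([
    PickupRequest("cafe-12", cans=40, boxes=10, bottles=25,
                  date="2017-03-01", window="09:00-12:00", zone="Providencia"),
    PickupRequest("home-07", cans=5, boxes=2, bottles=12,
                  date="2017-03-01", window="09:00-12:00", zone="Providencia"),
])
```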

For collectors, it’s an efficient route with guaranteed recyclables, and they keep all the money they make. Lara’s team cuts out the middleman transporters who would previously take the material to large recycling companies. ReciclApp even has designated storage centres where collectors can leave material before a truck from large recyclers shows up….

Lara estimates that there are about 100,000 people trying to earn money from recycling in Chile. Those that work with ReciclApp have more than doubled their recycling earnings on average from about $100 USD per month to $250 USD. But even that, Lara admitted, is a small gain when you consider Chile’s high cost of living….

ReciclApp intends to change that. “We’re going to start hiring waste collectors, so they’ll have a set wage, a schedule, and can earn extra income based on how much they collect and how many homes or businesses they visit,” said ReciclApp’s director of operations, 25-year-old Manuel Fonseca….

For Fuentes, 40, the biggest improvement is how she’s treated. “Families value us as workers now, not as the lady who asks for donations and picks through the garbage,” she said. “We spent too many years hidden in the shadows. I feel different now. I’m not embarrassed of my work the way I used to be.”….(More)”

Social Media for Government


Book by Gohar Feroz Khan: “This book provides practical know-how on understanding, implementing, and managing mainstream social media tools (e.g., blogs and micro-blogs, social network sites, and content communities) from a public sector perspective. Through social media, government organizations can inform citizens, promote their services, seek public views and feedback, and monitor satisfaction with the services they offer so as to improve their quality. Given the exponential growth of social media in contemporary society, it has become an essential tool for communication, content sharing, and collaboration. This growth and these tools also present an unparalleled opportunity to implement a transparent, open, and collaborative government. However, many government organizations, particularly those in the developing world, are still somewhat reluctant to leverage social media, as it requires significant policy and governance changes, as well as specific know-how, skills and resources to plan, implement and manage social media tools. As a result, governments around the world ignore or mishandle the opportunities and threats presented by social media. To help policy makers and governments implement a social media-driven government, this book provides guidance in developing an effective social media policy and strategy. It also addresses issues such as those related to security and privacy….(More)”

Human Decisions and Machine Predictions


NBER Working Paper by Jon Kleinberg, Himabindu Lakkaraju, Jure Leskovec, Jens Ludwig, and Sendhil Mullainathan: “We examine how machine learning can be used to improve and understand human decision-making. In particular, we focus on a decision that has important policy consequences. Millions of times each year, judges must decide where defendants will await trial—at home or in jail. By law, this decision hinges on the judge’s prediction of what the defendant would do if released. This is a promising machine learning application because it is a concrete prediction task for which there is a large volume of data available. Yet comparing the algorithm to the judge proves complicated. First, the data are themselves generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the single variable that the algorithm focuses on; for instance, judges may care about racial inequities or about specific crimes (such as violent crimes) rather than just overall crime risk. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: a policy simulation shows crime can be reduced by up to 24.8% with no change in jailing rates, or jail populations can be reduced by 42.0% with no increase in crime rates. Moreover, we see reductions in all categories of crime, including violent ones. Importantly, such gains can be had while also significantly reducing the percentage of African-Americans and Hispanics in jail. We find similar results in a national dataset as well. In addition, by focusing the algorithm on predicting judges’ decisions, rather than defendant behavior, we gain some insight into decision-making: a key problem appears to be that judges respond to ‘noise’ as if it were signal. These results suggest that while machine learning can be valuable, realizing this value requires integrating these tools into an economic framework: being clear about the link between predictions and decisions; specifying the scope of payoff functions; and constructing unbiased decision counterfactuals….(More)”
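The basic prediction setup, and the ‘selective labels’ problem the authors stress, can be caricatured in a few lines. This sketch uses synthetic stand-in features and an off-the-shelf classifier; it is not the paper’s model or data:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for defendant features (age, priors, charge, ...).
X = rng.normal(size=(10_000, 5))
outcome = (X[:, 0] + rng.normal(size=10_000)) > 1.0   # 1 = crime if released
released = rng.random(10_000) < 0.7                   # prior judges' decisions

# Selective labels: outcomes are observed only for released defendants,
# so the training data is filtered by the very decisions under study.
X_obs, y_obs = X[released], outcome[released]
X_tr, X_te, y_tr, y_te = train_test_split(X_obs, y_obs, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]  # predicted crime risk if released
```

Evaluating such a score against judges then requires the econometric strategies the abstract mentions, since the counterfactual outcomes of detained defendants are never observed.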

Using Algorithms To Predict Gentrification


Tanvi Misra in CityLab: “I know it when I see it,” is as true for gentrification as it is for pornography. Usually, it’s when a neighborhood’s property values and demographics are already changing that the worries about displacement set in—rousing housing advocates and community organizers to action. But by that time, it’s often hard to pause, and put in safeguards for the neighborhood’s most vulnerable residents.

But what if there were an early warning system that detects where price appreciation or decline is about to occur? Predictive tools like this have been developed around the country, most notably by researchers in San Francisco. And their value is clear: they help city leaders and non-profits pinpoint, ahead of time, where to preserve existing affordable housing, where to build more, and where to attract business investment. But they’re often too academic or too obscure, which is why it’s not yet clear how they’re being used by policymakers and planners.

That’s the problem Ken Steif, at the University of Pennsylvania, is working to solve, in partnership with Alan Mallach, from the Center for Community Progress.

Mallach’s non-profit focuses on revitalizing distressed neighborhoods, particularly in “legacy cities.” These are towns like St. Louis, Flint, Dayton, and Baltimore that have experienced population loss and economic contraction in recent years, and suffer from property vacancies, blight, and unemployment. Mallach is interested in understanding which neighborhoods are likely to continue down that path, and which ones will do a 180-degree turn. Right now, he can make those predictions intuitively, based on his observations of neighborhood characteristics like housing stock, median income, and race. But an objective assessment can help confirm or refute his hypotheses.

That’s where Steif comes in. Having consulted with cities and non-profits on place-based data analytics, Steif has developed a number of algorithms that predict the movement of housing markets using expensive private data from entities like Zillow. Mallach suggested he try his algorithms on Census data, which is free and standardized.

The phenomenon he tested was ‘endogenous gentrification’—the idea that an increase in home prices moves from wealthy neighborhoods to less expensive ones in its vicinity, like a wave. …Steif used Census data from 1990 and 2000 to predict housing price change in 2010 in 29 big and small legacy cities. His algorithms took into account the relationship between the median home price of a census tract and the prices of the tracts around it, the proximity of census tracts to high-cost areas, and the spatial patterns in home price distribution. They also folded in variables like race, income and housing supply, among others.
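A toy version of the spatial feature engineering described above might look as follows. The tract table, adjacency list and column names are all invented for illustration; the article does not publish Steif’s code:

```python
import numpy as np
import pandas as pd

# Invented tract-level inputs standing in for the 1990/2000 Census extracts.
tracts = pd.DataFrame({
    "tract_id": [1, 2, 3, 4],
    "price_2000": [80_000, 120_000, 300_000, 95_000],  # median home price
    "x": [0.0, 1.0, 2.0, 3.0],                         # centroid coordinates
    "y": [0.0, 0.0, 0.0, 0.0],
})
adjacent = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}      # e.g. from a shapefile

prices = tracts.set_index("tract_id")["price_2000"]

# Spatial lag: the mean price of adjacent tracts, the carrier of the 'wave'.
tracts["neighbor_price"] = [prices.loc[adjacent[t]].mean()
                            for t in tracts["tract_id"]]

# Proximity to high-cost areas: distance to the nearest top-quartile tract.
high = tracts[tracts["price_2000"] >= tracts["price_2000"].quantile(0.75)]
diff = (tracts[["x", "y"]].to_numpy()[:, None, :]
        - high[["x", "y"]].to_numpy()[None, :, :])
tracts["dist_to_high_cost"] = np.sqrt((diff ** 2).sum(-1)).min(axis=1)

# These engineered columns, plus race, income and housing supply, would
# feed a model predicting each tract's price change in the next decade.
```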

After cross-checking the 2010 prediction with actual home prices, he projected the neighborhood change all the way to 2020. His algorithms were able to compute the speed and breadth of the wave of gentrification over time reasonably well, overall…(More)”.

Think tanks can transform into the standard-setters and arbiters of quality of 21st century policy analysis


Marcos Hernando, Diane Stone and Hartwig Pautz in LSE Impact Blog: “Last month, the annual Global GoTo Think Tank Index Report was released, amid claims “think tanks are more important than ever before”. It is unclear whether this was said in spite of, or because of, the emergence of ‘post-truth politics’. Experts have become targets of anger and derision, struggling to communicate facts and advance evidence-based policy. Popular dissatisfaction with ‘policy wonks’ has meant think tanks face challenges to their credibility at a time they are under pressure from increased competition. The 20th century witnessed the rise of the think tank, but the 21st century might yet see its decline. To avoid such a fate, we believe think tanks must reposition themselves as the credible arbiters able to distinguish between poor analysis and good quality research….

In recent years, think tanks have faced three major challenges: financial limits in a world characterised by austerity; increased competition, both among think tanks and with other types of policy research organisations; and a growing questioning of, and popular dissatisfaction with, the role of the ‘expert’ itself. Here, we look at each of these in turn…

Nevertheless, think tanks do retain some competitive advantages. The rapid proliferation of knowledge complicates the absorption of information among policymakers. To put it simply, there are limits to the quantity and diversity of knowledge that government actors can make sense of, especially in states hollowed out by austerity programmes and burdened by ever-higher public demands. Managing the over-supply of (occasionally dubious) evidence and policy analysis from research-based NGOs, universities and advocacy groups has become a problem of governance. But this issue also opens a space for the reinvention of think tanks.

With information overload comes a need for talented editors and skilled curators. That is, organisations as much as individuals that help those within policy processes to discern the reliability and usefulness of analytic products. Potentially, think tanks could transform into significant standard-setters and arbiters of quality of 21st century policy analysis. If they do not, they risk becoming just another group in the overpopulated ‘post-truth’ policy advice industry….(More)”

Why Big Data Is a Big Deal for Cities


John M. Kamensky in Governing: “We hear a lot about “big data” and its potential value to government. But is it really fulfilling the high expectations that advocates have assigned to it? Is it really producing better public-sector decisions? It may be years before we have definitive answers to those questions, but new research suggests that it’s worth paying a lot of attention to.

University of Kansas Prof. Alfred Ho recently surveyed 65 mid-size and large cities to learn what is going on, on the front line, with the use of big data in making decisions. He found that big data has made it possible to “change the time span of a decision-making cycle by allowing real-time analysis of data to instantly inform decision-making.” This decision-making occurs in areas as diverse as program management, strategic planning, budgeting, performance reporting and citizen engagement.

Cities are natural repositories of big data that can be integrated and analyzed for policy- and program-management purposes. These repositories include data from public safety, education, health and social services, environment and energy, culture and recreation, and community and business development. They include both structured data, such as financial and tax transactions, and unstructured data, such as recorded sounds from gunshots and videos of pedestrian movement patterns. And they include data supplied by the public, such as the Boston residents who use a phone app to measure road quality and report problems.

These data repositories, Ho writes, are “fundamental building blocks,” but the challenge is to shift the ownership of data from separate departments to an integrated platform where the data can be shared.

There’s plenty of evidence that cities are moving in that direction and that they already are systematically using big data to make operational decisions. Among the 65 cities that Ho examined, he found that 49 have “some form of data analytics initiatives or projects” and that 30 have established “a multi-departmental team structure to do strategic planning for these data initiatives.”….The effective use of big data can lead to dialogs that cut across school-district, city, county, business and nonprofit-sector boundaries. But more importantly, it provides city leaders with the capacity to respond to citizens’ concerns more quickly and effectively….(More)”