The Dark Side of Sunlight


Essay by James D’Angelo and Brent Ranalli in Foreign Affairs: “…76 percent of Americans, according to a Gallup poll, disapprove of Congress.

This dysfunction started well before the Trump presidency. It has been growing for decades, despite promise after promise and proposal after proposal to reverse it. Many explanations have been offered, from the rise of partisan media to the growth of gerrymandering to the explosion of corporate money. But one of the most important causes is usually overlooked: transparency. A practice usually seen as an antidote to corruption and bad government is, it turns out, contributing to both.

The problem began in 1970, when a group of liberal Democrats in the House of Representatives spearheaded the passage of new rules known as “sunshine reforms.” Advertised as measures that would make legislators more accountable to their constituents, these changes increased the number of votes that were recorded and allowed members of the public to attend previously off-limits committee meetings.

But the reforms backfired. By diminishing secrecy, they opened up the legislative process to a host of actors—corporations, special interests, foreign governments, members of the executive branch—that pay far greater attention to the thousands of votes taken each session than the public does. The reforms also deprived members of Congress of the privacy they once relied on to forge compromises with political opponents behind closed doors, and they encouraged them to bring useless amendments to the floor for the sole purpose of political theater.

Fifty years on, the results of this experiment in transparency are in. When lawmakers are treated like minors in need of constant supervision, it is special interests that benefit, since they are the ones doing the supervising. And when politicians are given every incentive to play to their base, politics grows more partisan and dysfunctional. In order for Congress to better serve the public, it has to be allowed to do more of its work out of public view.

The idea of open government enjoys nearly universal support. Almost every modern president has paid lip service to it. (Even the famously paranoid Richard Nixon said, “When information which properly belongs to the public is systematically withheld by those in power, the people soon become ignorant of their own affairs, distrustful of those who manage them, and—eventually—incapable of determining their own destinies.”) From former Republican Speaker of the House Paul Ryan to Democratic Speaker of the House Nancy Pelosi, from the liberal activist Ralph Nader to the anti-tax crusader Grover Norquist, all agree that when it comes to transparency, more is better.

It was not always this way. It used to be that secrecy was seen as essential to good government, especially when it came to crafting legislation. …(More)”

We’ll soon know the exact air pollution from every power plant in the world. That’s huge.


David Roberts at Vox: “A nonprofit artificial intelligence firm called WattTime is going to use satellite imagery to precisely track the air pollution (including carbon emissions) coming out of every single power plant in the world, in real time. And it’s going to make the data public.

This is a very big deal. Poor monitoring and gaming of emissions data have made it difficult to enforce pollution restrictions on power plants. This system promises to effectively eliminate both problems….

The plan is to use data from satellites that make their imagery publicly available (like the European Union’s Copernicus network and the US Landsat network), as well as from a few private companies that charge for their data (like Digital Globe). The data will come from a variety of sensors operating at different wavelengths, including thermal infrared that can detect heat.

The images will be processed by various algorithms to detect signs of emissions. It has already been demonstrated that a great deal of pollution can be tracked simply through identifying visible smoke. WattTime says it can also use infrared imaging to identify heat from smokestack plumes or cooling-water discharge. Sensors that can directly track NO2 emissions are in development, according to WattTime executive director Gavin McCormick.

Between visible smoke, heat, and NO2, WattTime will be able to derive exact, real-time emissions information, including information on carbon emissions, for every power plant in the world. (McCormick says the data may also be used to derive information about water pollutants like nitrates or mercury.)
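The article does not describe WattTime’s models in detail, so the following is only a rough sketch of the general technique: calibrate satellite-derived proxy signals (visible smoke, plume heat, NO2) against plants that already report verified emissions, then apply the fitted model to plants that do not. Every name and number below is hypothetical.

    # A minimal, illustrative sketch (not WattTime's actual model): fit a
    # linear map from satellite proxy signals to verified emissions, then
    # apply it to plants that do not report. All figures are hypothetical.
    import numpy as np

    # Rows = observations of *reporting* plants; columns =
    # [visible_smoke_score, thermal_plume_intensity, no2_column].
    X = np.array([
        [0.2, 310.5, 1.1],
        [0.7, 415.0, 2.9],
        [0.1, 290.2, 0.8],
        [0.9, 450.3, 3.5],
    ])
    y = np.array([120.0, 540.0, 90.0, 690.0])  # verified CO2, tons/hour

    # Least-squares fit with an intercept term.
    X1 = np.hstack([X, np.ones((len(X), 1))])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

    def estimate_emissions(smoke, thermal, no2):
        """Estimate tons of CO2/hour for a non-reporting plant."""
        return float(np.array([smoke, thermal, no2, 1.0]) @ coef)

    print(round(estimate_emissions(0.5, 380.0, 2.0), 1))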

Google.org, Google’s philanthropic wing, is getting the project off the ground (pardon the pun) with a $1.7 million grant; it was selected through the Google AI Impact Challenge….(More)”.

Belgium’s democratic experiment


David van Reybrouck in Politico: “Those looking for a solution to the wave of anger and distrust sweeping Western democracies should have a look at an experiment in European democracy taking place in a small region in eastern Belgium.

Starting in September, the parliament representing the German-speaking region of Belgium will hand some of its powers to a citizens’ assembly drafted by lot. It’ll be the first time a political institution creates a permanent structure to involve citizens in political decision making.

It’s a move Belgian media has rightly hailed as “historic.” I was in parliament the night MPs from all six parties moved past ideological differences to endorse the bill. It was a courageous move, a sign to other politicians — who tend to see their voters as a threat rather than a resource — that citizens should be trusted, not feared or “spun.”

Nowhere else in the world will everyday citizens be so consistently involved in shaping the future of their community. In times of massive, widespread distrust of party politics, German-speaking Belgians will be empowered to put the issues they care about on the agenda, to discuss potential solutions, and to monitor the follow-up of their recommendations as they pass through parliament and government. Politicians, in turn, will be able to tap independent citizens’ panels to deliberate over thorny political issues.

This experiment is happening on a small scale: Belgium’s German-speaking community, the country’s third linguistic region, is the smallest federal entity in Europe. But its powers are comparable with those of Scotland or the German state of North Rhine-Westphalia, and the lessons of its experiment with a “people’s senate” will have implications for democrats across Europe….(More)”.

A New Way of Voting That Makes Zealotry Expensive


Peter Coy at Bloomberg Business Week: “An intriguing new tool of democracy just had its first test in the real world of politics, and it passed with flying colors.

The tool is called quadratic voting, and it’s just as nerdy as it sounds. The concept is that each voter is given a certain number of tokens—say, 100—to spend as he or she sees fit on votes for a variety of candidates or issues. Casting one vote for one candidate or issue costs one token, but two votes cost four tokens, three votes cost nine tokens, and so on up to 10 votes costing all 100 of your tokens. In other words, if you really care about one candidate or issue, you can cast up to 10 votes for him, her, or it, but it’s going to cost you all your tokens.
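The arithmetic is simple enough to sketch in a few lines. The snippet below (purely illustrative; the bill names are made up) captures the quadratic cost rule described above: casting v votes on one item costs v² tokens, so a 100-token budget buys either 10 votes on a single item or 1 vote on each of 100 items.

    # A minimal sketch of the quadratic-voting arithmetic described above.
    def cost(votes):
        """Token cost of casting `votes` votes on a single item."""
        return votes ** 2

    def is_feasible(allocation, budget=100):
        """Check that votes spread across items fit the token budget."""
        return sum(cost(v) for v in allocation.values()) <= budget

    # Ten votes on one bill exhausts the whole budget...
    print(cost(10), is_feasible({"equal-pay bill": 10}))      # 100 True
    # ...while the same tokens buy one vote each on 100 different bills.
    print(is_feasible({f"bill-{i}": 1 for i in range(100)}))  # True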

Quadratic voting was invented not by political scientists but by economists and others, including Glen Weyl, an economist and principal researcher at Microsoft Corp. The purpose of quadratic voting is to determine “whether the intense preferences of the minority outweigh the weak preferences of the majority,” Weyl and Eric Posner, a University of Chicago Law School professor, wrote last year in an important book called Radical Markets: Uprooting Capitalism and Democracy for a Just Society….

This spring, quadratic voting was used in a successful experiment by the Democratic caucus of the Colorado House of Representatives. The lawmakers used it to decide on their legislative priorities for the coming two years among 107 possible bills. (Wired magazine wrote about it here.)…

In this year’s experiment, the 41 lawmakers in the Democratic caucus were given 100 tokens each to allocate among the 107 bills. No one chose to spend all 100 tokens on a single bill. Many of them spread their votes around widely but thinly because it was inexpensive to do so—one vote is just one token. The top vote-getter by a wide margin turned out to be a bill guaranteeing equal pay to women for equal work. “There was clear separation” of the favorites from the also-rans, says Colorado state Representative Chris Hansen.

The computer interface and other logistics were provided by Democracy Earth, which describes itself as a borderless community and “a global commons of self-sovereign citizens.” The lawmakers had more immediate concerns—hammering out a party agenda. “Some members were more tech-savvy,” Hansen says. “Some started skeptical but came around. I was pleasantly surprised. There was this feeling of ownership—your voice being heard.”

I recently wrote about the democratic benefits of ranked-choice voting, in which voters rank all the candidates in a race and votes are reassigned from the lowest vote-getters to the higher finishers until someone winds up with a majority. But although ranked-choice voting is gaining in popularity, it traces its roots back to the 19th century. Quadratic voting is much more of a break from the past. “This is a new idea, which is rare in economic theory, so it should be saluted as such, especially since it is accompanied by outstanding execution,” George Mason University economist Tyler Cowen wrote in 2015. (He did express some cautions about it as well.)…(More)”.
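For comparison, the instant-runoff rule behind ranked-choice voting is also easy to sketch: repeatedly eliminate the candidate with the fewest first-place votes and transfer those ballots to each voter’s next surviving choice, until someone holds a majority. The ballots below are hypothetical.

    # A minimal instant-runoff sketch of ranked-choice counting as
    # described above. Each ballot lists candidates, most-preferred first.
    from collections import Counter

    def ranked_choice_winner(ballots):
        ballots = [list(b) for b in ballots]
        while True:
            tally = Counter(b[0] for b in ballots if b)
            total = sum(tally.values())
            leader, votes = tally.most_common(1)[0]
            if votes * 2 > total:              # strict majority reached
                return leader
            loser = min(tally, key=tally.get)  # fewest first-place votes
            for b in ballots:                  # transfer the loser's ballots
                if b and b[0] == loser:
                    b.pop(0)

    ballots = [["A", "B"], ["A", "C"], ["B", "C"], ["C", "B"], ["C", "B"]]
    print(ranked_choice_winner(ballots))  # "C", after "B" is eliminated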

As Surveys Falter, Big Data Polling Narrows Our Societal Understanding


Kalev Leetaru at Forbes: “One of the most talked-about stories in the world of polling and survey research in recent years has been the gradual decline of survey response rates and, with them, the reliability of the resulting insights….

The online world’s perceived anonymity has offered some degree of reprieve, with online polls and surveys often besting traditional approaches in assessing views on society’s most controversial issues. Yet here as well, growing public awareness of phishing and online safety is making these methods ever more problematic.

The answer has been the rise of “big data” analysis of society’s digital exhaust to fill in the gaps….

Is it truly the same answer though?

Constructing and conducting a well-designed survey means being able to ask the public exactly the questions of interest. Most importantly, it entails being able to ensure representative demographics of respondents.

An online-only poll is unlikely to accurately capture the perspectives of the three quarters of the earth’s population that the digital revolution has left behind. Even within the US, social media platforms are extraordinarily skewed.

The far greater problem is that society’s data exhaust is rarely a perfect match for the questions of greatest interest to policymakers and the public.

Cellphone mobility records can offer an exquisitely detailed look at how the people of a city go about their daily lives, but beneath all that blinding light are the invisible members of society not deemed valuable to advertisers and thus not counted. Even for the urban society members whose phones are their ever-present companions, mobility data only goes so far. It can tell us that occupants of a particular part of the city during the workday spend their evenings in a particular part of the city, allowing us to understand their work/life balance, but it offers few insights into their political leanings.

One of the greatest challenges of today’s “big data” surveying is that it requires us to narrow our gaze to only those questions which can be easily answered from the data at hand.

Much as AI’s crisis of bias comes from the field’s steadfast refusal to pay for quality data, settling for highly biased free data, so too has “big data” surveying limited itself largely to datasets it can freely and easily acquire.

The result is that with traditional survey research, we are free to ask the precise questions we are most interested in. With data exhaust research, we must imperfectly shoehorn our questions into the few available metrics. With sufficient creativity it is typically possible to find some way of proxying the given question, but the resulting proxies may be highly unstable, with little understanding of when and where they may fail.

Much like how the early rise of the cluster computing era caused “big data” researchers to limit the questions they asked of their data to just those they could fit into a set of tiny machines, so too has the era of data exhaust surveying forced us to greatly restrict our understanding of society.

Most dangerously, however, big data surveying implicitly means we are measuring only the portion of society our vast commercial surveillance state cares about.

In short, we are only able to measure those deemed of greatest interest to advertisers and thus the most monetizable.

Putting this all together, the decline of traditional survey research has led to the rise of “big data” analysis of society’s data exhaust. Instead of giving us an unprecedented new view into the heartbeat of daily life, this reliance on the unintended output of our digital lives has forced researchers to greatly narrow the questions they can explore and severely skews them to the most “monetizable” portions of society.

In the end, the shift of societal understanding from precision surveys to the big data revolution has led not to an incredible new understanding of what makes us tick, but rather to a far smaller, less precise, and less accurate view than ever before, just when our need to understand ourselves has never been greater….(More)”.

Reconnecting citizens with EU decision-making is possible – and needs to happen now


Opinion piece by Anthony Zacharzewski: “Maybe it’s the Brexit effect, or perhaps the memories of the great recession are fading, but in poll after poll, Europe’s citizens are saying that they feel more European and are strongly supportive of EU membership. …

While sighs of relief can be heard from Schuman to Strasbourg after a decade in which the EU has bounced from crisis to crisis, the new Parliament and Commission will inherit a fragile and fractious Europe this year. One of their first and most important tasks will be to connect EU citizens more closely to the institutions and their decision making….

The new European Commission and Parliament have the chance to change that, by adopting an ambitious open government agenda that puts citizen participation in decision making at its heart.

There are three things on our wish list for doing this.

The first thing on our list is an EU-wide commitment to policy making “in the open.” Built on a renewed commitment to transparency, it would set a unified approach to consultation and identify major policy areas where citizen involvement is valuable and where citizens are likely to want to be involved. These could include issues such as migration and climate change. Member states, particularly those in the Open Government Partnership, already have a wealth of good practice that can help inform this, while the Open Government Network for Europe, which brings together civil society and government voices, is ready to help.

Secondly, the connection to civil society and citizens also needs to be made beyond the European level, supporting and making use of the rapidly growing networks of democratic innovation at the local level. Citizen participation is increasingly shifting from one-off events to a standing part of the governing system, and the European institutions need to listen to local conversations and support them with better information. Public Square, our own project run in partnership with mySociety and funded by Luminate, is a good example. It is working with local government and citizens to understand how meaningful citizen participation can become an everyday part of the way all local decision-making happens.

The last item on our wish list would be greater coherence between the institutions in Brussels and Strasbourg to better involve citizens. While the European Parliament, Commission and Council all have their different roles and prerogatives, without a co-ordinated approach, the attention and resources they have will be dissipated across multiple conversations. Most importantly, it will be harder to demonstrate to citizens that their contributions have made a difference….(More)”.

A weather tech startup wants to do forecasts based on cell phone signals


Douglas Heaven at MIT Technology Review: “On 14 April, more snow fell on Chicago than had fallen in nearly 40 years. Weather services didn’t see it coming: they forecast one or two inches at worst. But when the late winter snowstorm came, it caused widespread disruption, dumping enough snow that airlines had to cancel more than 700 flights across all of the city’s airports.

One airline did better than most, however. Instead of relying on the usual weather forecasts, it listened to ClimaCell – a Boston-based “weather tech” start-up that claims it can predict the weather more accurately than anyone else. According to the company, its correct forecast of the severity of the coming snowstorm allowed the airline to better manage its schedules and minimize losses due to delays and diversions. 

Founded in 2015, ClimaCell has spent the last few years developing the technology and business relationships that allow it to tap into millions of signals from cell phones and other wireless devices around the world. It uses the quality of these signals as a proxy for local weather conditions, such as precipitation and air quality. It also analyzes images from street cameras. It is offering subscribers a weather forecasting service that it claims is 60 percent more accurate than that of existing providers, such as NOAA.

The internet of weather

The approach makes sense, in principle. Other forecasters use proxies, such as radar signals. But by using information from millions of everyday wireless devices, ClimaCell claims it has a far more fine-grained view of most of the globe than other forecasters get from the existing network of weather sensors, which range from ground-based devices to satellites. (ClimaCell taps into these, too.)…(More)”.
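The article doesn’t spell out ClimaCell’s proprietary models, but the physics of one such proxy is well established: rain attenuates microwave signals, and the standard power-law relation A = k·R^α (ITU-R P.838) links specific attenuation A (in dB/km) to rain rate R (in mm/h). The sketch below is illustrative only; the coefficients are rough, frequency-dependent values, not ClimaCell’s.

    # A hedged sketch of one signal-to-weather proxy: invert the power-law
    # rain-attenuation relation A = k * R**alpha to estimate rain rate
    # along a wireless link. Coefficients here are illustrative only.
    def rain_rate_from_attenuation(excess_loss_db, path_km,
                                   k=0.12, alpha=1.06):
        """Estimate rain rate (mm/h) from a link's excess signal loss."""
        specific_attenuation = excess_loss_db / path_km  # dB per km
        return (specific_attenuation / k) ** (1.0 / alpha)

    # A 3 km link losing 6 dB beyond its clear-sky baseline:
    print(f"{rain_rate_from_attenuation(6.0, 3.0):.1f} mm/h")  # ~14.2 mm/h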

How Technology Could Revolutionize Refugee Resettlement


Krishnadev Calamur in The Atlantic: “… For nearly 70 years, the process of interviewing, allocating, and accepting refugees has gone largely unchanged. In 1951, countries came together in Geneva, Switzerland, to sign the Refugee Convention, the pact that defines who is a refugee, what refugees’ rights are, and what legal obligations states have to protect them; some 145 states have since become party to it.

This process was born of the idealism of the postwar years—an attempt to make certain that those fleeing war or persecution could find safety so that horrific moments in history, such as the Holocaust, didn’t recur. The pact may have been far from perfect, but in successive years, it was a lifeline to Afghans, Bosnians, Kurds, and others displaced by conflict.

The world is a much different place now, though. The rise of populism has brought with it a concomitant hostility toward immigrants in general and refugees in particular. Last October, a gunman who had previously posted anti-Semitic messages online against HIAS, the Jewish refugee-resettlement agency, killed 11 worshippers in a Pittsburgh synagogue. Many of the policy arguments over resettlement have shifted focus from humanitarian relief to security threats and cost. The Trump administration has drastically cut the number of refugees the United States accepts, and large parts of Europe are following suit.

If it works, Annie, a new piece of matching software, could change that dynamic. Developed at Worcester Polytechnic Institute in Massachusetts, Lund University in Sweden, and the University of Oxford in Britain, the software uses what’s known as a matching algorithm to allocate refugees with no ties to the United States to their new homes. (Refugees with ties to the United States are resettled in places where they have family or community support; software isn’t involved in the process.)

Annie’s algorithm is based on a machine learning model in which a computer is fed huge piles of data from past placements, so that the program can refine its future recommendations. The system examines a series of variables—physical ailments, age, levels of education and languages spoken, for example—related to each refugee case. In other words, the software uses previous outcomes and current constraints to recommend where a refugee is most likely to succeed. Every city where HIAS has an office or an affiliate is given a score for each refugee. The higher the score, the better the match.
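The article describes the scoring step but not the assignment step in detail. As a minimal sketch of the idea (not the Annie software itself), one could greedily place each case in its highest-scoring city that still has capacity; the real system reportedly optimizes across cases, and the scores and cities below are hypothetical.

    # A minimal sketch of score-based placement (not Annie itself):
    # each (case, city) pair has a predicted-success score, and each case
    # goes to its best-scoring city with resettlement slots remaining.
    def assign_cases(scores, capacity):
        """scores[case][city] -> predicted success; capacity[city] -> slots."""
        remaining = dict(capacity)
        placement = {}
        for case, city_scores in scores.items():
            open_cities = {c: s for c, s in city_scores.items()
                           if remaining[c] > 0}  # assumes a city stays open
            best = max(open_cities, key=open_cities.get)
            placement[case] = best
            remaining[best] -= 1
        return placement

    scores = {
        "case-1": {"Boston": 0.82, "Houston": 0.64},
        "case-2": {"Boston": 0.75, "Houston": 0.71},
    }
    print(assign_cases(scores, {"Boston": 1, "Houston": 1}))
    # {'case-1': 'Boston', 'case-2': 'Houston'}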

This is a drastic departure from how refugees are typically resettled. Each week, HIAS and the eight other agencies that allocate refugees in the United States make their decisions based largely on local capacity, with limited emphasis on individual characteristics or needs….(More)”.

LAPD moving away from data-driven crime programs over potential racial bias


Mark Puente in The Los Angeles Times: “The Los Angeles Police Department pioneered the controversial use of data to pinpoint crime hot spots and track violent offenders.

Complex algorithms and vast databases were supposed to revolutionize crime fighting, making policing more efficient as number-crunching computers helped to position scarce resources.

But critics long complained about inherent bias in the data — gathered by officers — that underpinned the tools.

They claimed a partial victory when LAPD Chief Michel Moore announced he would end one highly touted program intended to identify and monitor violent criminals. On Tuesday, the department’s civilian oversight panel raised questions about whether another program, aimed at reducing property crime, also disproportionately targets black and Latino communities.

Members of the Police Commission demanded more information about how the agency plans to overhaul a data program that helps predict where and when crimes will likely occur. One questioned why the program couldn’t be suspended.

“There is very limited information” on the program’s impact, Commissioner Shane Murphy Goldsmith said.

The action came as so-called predictive policing — using search tools, point scores and other methods — is under increasing scrutiny by privacy and civil liberties groups that say the tactics result in heavier policing of black and Latino communities. The argument was underscored at Tuesday’s commission meeting when several UCLA academics cast doubt on the research behind crime modeling and predictive policing….(More)”.

How nudge theory is ageing well


Julian Baggini at the Financial Times: “A decade ago, Richard Thaler and Cass Sunstein’s book Nudge was on the desk of every serious politician and policy wonk. Its central thesis was alluringly simple: by changing the environment in which we make decisions — the “choice architecture” — people could be encouraged to do things that were good for them and for society without governments compelling them to do anything.

The idea hit the liberal sweet-spot, promising maximum social impact for minimal interference with personal freedom. In 2010, Britain’s government set up its Behavioural Insights Team — popularly known as the “nudge unit” — to put these ideas into practice.

Around the world, others followed. Sunstein is justly proud that 10m poor American children now get free breakfast and lunch during the academic year as a result of just one such intervention making enrolment for free school meals automatic.

Ten years on, Sunstein has produced two new books to win over the unconverted and boost the faith of true believers. One, On Freedom, is a tiny, commuter-friendly pamphlet between hard covers. The other, Trusting Nudges, co-authored with the behavioural economist Lucia A Reisch, is a short, thoughtful, measured and important analysis of what citizens actually think about nudging and why that matters — albeit with the dry, academic furniture of endless tables, footnotes and technical appendices.

Despite the stylistic gulf between them, the two books are best read together as a response to those who would like to give nudges the nudge, claiming that they are covert, manipulative, an insult to human agency, and that they place too much trust in governments and too little in human reason. Not only that, but for all the hype, nudges only work at the margins, delivering relatively minor results without having any major impact on poverty, inequity or inequality.

On Freedom economically and elegantly takes apart the accusation that nudges undermine liberty. Sunstein rightly points out that a nudge is only a nudge by definition if it leaves the nudged able to choose otherwise. For example, the system adopted by several jurisdictions to put people on organ donation registers by default carries with it the right to opt out. Nor are the best nudges covert.

There may not be a sign at the canteen telling you that healthy foods have been put at the front because that’s where you’re more likely to choose them, but organisations that adopt this as a policy can and should do so openly. Sunstein’s most important argument is that “we cannot wish choice architecture away”: some position on the supermarket shelves has to be the one people take more from, and something has to be the default for benefit claims. The question is not whether we nudge but how we do so: with forethought or without….(More)”