If you build it… will they come?


Laura Bacon at Omidyar Network: “What do datasets on Danish addresses, Indonesian elections, Singapore Dengue Fever, Slovakian contracts, Uruguayan health service provision, and Global weather systems have in common? Read on to learn more…

On May 12, 2016, more than 40 nations’ leaders gathered in London for an Anti-Corruption Summit, convened by UK Prime Minister David Cameron. Among the commitments made, 40 countries pledged to make their procurement processes open by default, with 14 countries specifically committing to publish to the Open Contracting Data Standard.

This conference and these commitments can be seen as part of a larger global norm toward openness and transparency, also embodied by the Open Government Partnership, Open Data Charter, and increasing numbers of Open Data Portals.

As government data is increasingly published openly in the public domain, valid questions have been raised about what impact the data will have: As governments release this data, will it be accessed and used? Will it ultimately improve lives, root out corruption, hold answers to seemingly intractable problems, and lead to economic growth?*

Omidyar Network — having supported several Open Data organizations and platforms such as Open Data Institute, Open Knowledge, and Web Foundation — sought data-driven answers to these questions. After a public call for proposals, we selected NYU’s GovLab to conduct research on the impact open data has already had. Not the potential or prospect of impact, but past proven impact. The GovLab research team, led by Stefaan Verhulst, investigated a variety of sectors — health, education, elections, budgets, contracts, etc. — in a variety of locations, spanning five continents.

Their findings are promising and exciting, demonstrating that open data is changing the world by empowering people, improving governance, solving public problems, and leading to innovation. A summary is contained in this Key Findings report, and is accompanied by many open data case studies posted in this Open Data Impact Repository.

Of course, stories such as this are not 100% rosy, and the report is clear about the challenges ahead. There are plenty of cases in which open data has had minimal impact. There are cases where there was negative impact. And there are obstacles to open data reaching its full potential: namely, open data projects that don’t respond to citizens’ questions and needs, a lack of technical capacity on either the data provider or data user side, inadequate protections for privacy and security, and a shortage of resources.

But this research holds good news: Danish addresses, Indonesian elections, Singapore Dengue Fever, Slovakian contracts, Uruguayan health service provision, Global weather systems, and others were all opened up. And all changed the world by empowering citizens, improving governance, solving public problems, and leading to innovation. Please see this report for more….(More)”

See also odimpact.org

Big Data for public policy: the quadruple helix


Julia Lane in the Journal of Policy Analysis and Management: “Data from the federal statistical system, particularly the Census Bureau, have long been a key resource for public policy. Although most of those data have been collected through purposive surveys, there have been enormous strides in the use of administrative records on business (Jarmin & Miranda, 2002), jobs (Abowd, Haltiwanger, & Lane, 2004), and individuals (Wagner & Layne, 2014). Those strides are now becoming institutionalized. The President has allocated $10 million to an Administrative Records Clearing House in his FY2016 budget. Congress is considering a bill to use administrative records, entitled the Evidence-Based Policymaking Commission Act, sponsored by Patty Murray and Paul Ryan. In addition, the Census Bureau has established a Center for “Big Data.” In my view, these steps represent important strides for public policy, but they are only part of the story. Public policy researchers must look beyond the federal statistical system and make use of the vast resources now available for research and evaluation.

All politics is local; “Big Data” now mean that policy analysis can increasingly be local. Modern empirical policy should be grounded in data provided by a network of city/university data centers. Public policy schools should partner with scholars in the emerging field of data science to train the next generation of policy researchers in the thoughtful use of the new types of data; the apparent secular decline in applications to public policy schools is coincident with the emergence of data science as a field of study in its own right. The role of national statistical agencies should be fundamentally rethought—and reformulated to one of four necessary strands in the data infrastructure: that of providing benchmarks, confidentiality protections, and national statistics….(More)”

The New Power Politics: Networks and Transnational Security Governance


Book edited by Deborah Avant and Oliver Westerwinter: “Traditional analyses of global security cannot explain the degree to which there is “governance” of important security issues — from combatting piracy to curtailing nuclear proliferation to reducing the contributions of extractive industries to violence and conflict. They are even less able to explain why contemporary governance schemes involve the various actors and take the many forms they do.

Juxtaposing the insights of scholars writing about new modes of governance with the logic of network theory, The New Power Politics offers a framework for understanding contemporary security governance and its variation. The framework rests on a fresh view of power and how it works in global politics. Though power is integral to governance, it is something that emerges from, and depends on, relationships. Thus, power is dynamic; it is something that governors must continually cultivate with a wide range of consequential global players, and how a governor uses power in one situation can have consequences for her future relationships, and thus, future power.

Understanding this new power politics is crucial for explaining and shaping the future of global security politics. This stellar group of scholars analyzes both the networking strategies of would-be governors and their impacts on the effectiveness of governance and whether it reflects broad or narrow concerns on a wide range of contemporary governance issues….(More)”

Crowdsourcing global governance: sustainable development goals, civil society, and the pursuit of democratic legitimacy


Paper by Joshua C. Gellers in International Environmental Agreements: Politics, Law and Economics: “To what extent can crowdsourcing help members of civil society overcome the democratic deficit in global environmental governance? In this paper, I evaluate the utility of crowdsourcing as a tool for participatory agenda-setting in the realm of post-2015 sustainable development policy. In particular, I analyze the descriptive representativeness (e.g., the degree to which participation mirrors the demographic attributes of non-state actors comprising global civil society) of participants in two United Nations orchestrated crowdsourcing processes—the MY World survey and e-discussions regarding environmental sustainability. I find that there exists a perceptible demographic imbalance among contributors to the MY World survey and considerable dissonance between the characteristics of participants in the e-discussions and those whose voices were included in the resulting summary report. The results suggest that although crowdsourcing may present an attractive technological approach to expand participation in global governance, ultimately the representativeness of that participation and the legitimacy of policy outputs depend on the manner in which contributions are solicited and filtered by international institutions….(More)”
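
Gellers quantifies how well participant demographics track a reference population; the paper does not publish measurement code, so the following is purely an illustrative sketch and not his method. One simple way to score “descriptive representativeness” is the total variation distance between the two demographic distributions (0 means participation mirrors the population perfectly, 1 means no overlap at all); every number below is invented:

    # Illustrative only: score "descriptive representativeness" as the total
    # variation distance between participants' demographics and a reference
    # population. Not the metric used in the paper; the shares are made up.

    def representativeness_gap(participants: dict, population: dict) -> float:
        """0.0 = perfect demographic mirror; 1.0 = no demographic overlap."""
        categories = set(participants) | set(population)
        return 0.5 * sum(abs(participants.get(c, 0.0) - population.get(c, 0.0))
                         for c in categories)

    # Hypothetical regional shares of survey respondents vs. global civil society
    survey = {"Africa": 0.15, "Asia": 0.25, "Europe": 0.40, "Americas": 0.20}
    world = {"Africa": 0.25, "Asia": 0.45, "Europe": 0.15, "Americas": 0.15}

    print(f"representativeness gap: {representativeness_gap(survey, world):.2f}")  # 0.30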

In the future, Big Data will make actual voting obsolete


Robert Epstein at Quartz: “Because I conduct research on how the Internet affects elections, journalists have lately been asking me about the primaries. Here are the two most common questions I’ve been getting:

  • Do Google’s search rankings affect how people vote?
  • How well does Google Trends predict the winner of each primary?

My answer to the first question is: Probably, but no one knows for sure. From research I have been conducting in recent years with Ronald E. Robertson, my associate at the American Institute for Behavioral Research and Technology, on the Search Engine Manipulation Effect (SEME, pronounced “seem”), we know that when higher search results make one candidate look better than another, an enormous number of votes will be driven toward the higher-ranked candidate—up to 80% of undecided voters in some demographic groups. This is partly because we have all learned to trust high-ranked search results, but it is mainly because we are lazy; search engine users generally click on just the top one or two items.

Because no one actually tracks search rankings, however—they are ephemeral and personalized, after all, which makes them virtually impossible to track—and because no whistleblowers have yet come forward from any of the search engine companies, we cannot know for sure whether search rankings are consistently favoring one candidate or another. This means we also cannot know for sure how search rankings are affecting elections. We know the power they have to do so, but that’s it.

As for the question about Google Trends, for a while I was giving a mindless, common-sense answer: Well, I said, Google Trends tells you about search activity, and if lots more people are searching for “Donald Trump” than for “Ted Cruz” just before a primary, then more people will probably vote for Trump.

When you run the numbers, search activity seems to be a pretty good predictor of voting. On primary day in New Hampshire this year, search traffic on Google Trends was highest for Trump, followed by John Kasich, then Cruz—and so went the vote. But careful studies of the predictive power of search activity have actually gotten mixed results. A 2011 study by researchers at Wellesley College in Massachusetts, for example, found that Google Trends was a poor predictor of the outcomes of the 2008 and 2010 elections.
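
Running those numbers yourself takes only a few lines. Below is a minimal sketch using pytrends, an unofficial third-party client for Google Trends (Google publishes no official Trends API, so the library can break when the underlying endpoints change), pointed at the week before the 2016 New Hampshire primary:

    # Sketch: compare candidates' Google Trends interest ahead of the 2016
    # New Hampshire primary (Feb 9). pytrends is an unofficial client and
    # may stop working if Google changes its endpoints.
    from pytrends.request import TrendReq

    candidates = ["Donald Trump", "John Kasich", "Ted Cruz"]

    pytrends = TrendReq(hl="en-US")
    pytrends.build_payload(candidates,
                           timeframe="2016-02-02 2016-02-09",  # week before the vote
                           geo="US-NH")                        # New Hampshire only

    interest = pytrends.interest_over_time()  # DataFrame, one column per term
    print(interest[candidates].mean().sort_values(ascending=False))

The naive rule here (whoever draws the most average search interest wins) is exactly the kind of heuristic those mixed results should make us wary of.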

So much for Trends. But then I got to thinking: Why are we struggling so hard to figure out how to use Trends or tweets or shares to predict elections when Google actually knows exactly how we are going to vote? Impossible, you say? Think again….

This leaves us with two questions, one small and practical and the other big and weird.

The small, practical question is: How is Google using those numbers? Might they be sharing them with their preferred presidential candidate, for example? That is not unlawful, after all, and Google executives have taken a hands-on role in past presidential campaigns. The Wall Street Journal reported, for example, that Eric Schmidt, head of Google at that time, was personally overseeing Barack Obama’s programming team at his campaign headquarters the night before the 2012 election.
And the big, weird question is: Why are we even bothering to vote?

Voting is such a hassle—the parking, the lines, the ID checks. Maybe we should all stay home and just let Google announce the winners….(More)”

Poli-hobbyism: A Theory of Mass Politics


Eitan D. Hersh: “For many citizens, participation in politics is not motivated by civic duty or self-interest, but by hobbyism: the objective is self-gratification. I offer a theory of political hobbyism, situate the theory in existing literature, and define and distinguish the hobbyist motivation from its alternatives. I argue that the prevalence of political hobbyism depends on historical conditions related to the nature of leisure time, the openness of the political process to mass participation, and the level of perceived threat. I articulate an empirical research agenda, highlighting how poli-hobbyism can help explain characteristics of participants, forms of participation, rates of participation, and the nature of partisanship. Political hobbyism presents serious problems for a functioning democracy, including participants confusing high stakes for low stakes, participation too focused on the gratifying aspects of politics, and unnecessarily potent partisan rivalries….(More)”

Countable


Countable: “Why does it have to be so hard to understand what our lawmakers are up to?

With Countable, it doesn’t.

Countable makes it quick and easy to understand the laws Congress is considering. We also streamline the process of contacting your lawmaker, so you can tell them how you want them to vote on bills under consideration.

You can use Countable to:

  • Read clear and succinct summaries of upcoming and active legislation.
  • Directly tell your lawmakers how to vote on those bills by clicking “Yea” or “Nay”.
  • Follow up on how your elected officials voted on bills, so you can hold them accountable in the next election cycle….(More)”

The era of development mutants


Giulio Quaggiotto at Nesta: “If you were looking for the cutting edge of the development sector, where would you go these days? You would probably look at startups like Premise, who have predicted food trends 25 days faster than national statistics in Brazil, or GiveDirectly, who are pushing the boundaries on evidence – from RCTs to new ways of mapping poverty – to fast-track the adoption of cash transfers.

Or perhaps you might turn your attention to PetaJakarta, who are experimenting with new responses to crises by harnessing human sensor networks. You might be tempted to consider Airbnb’s Disaster Response programme as an indicator of an emerging alternative infrastructure for disaster response (and perhaps raising questions about the political economy of it all).

And could Bitnation’s Refugee Emergency programme in response to the European refugee crisis be the possible precursor of future solutions for transnational issues – among the development sector’s hardest challenges? Are the business models of One Acre Fund, which provides services for smallholder farmers, or Floodtags, which analyses citizen data during floods for water and disaster managers, an indicator of future pathways to scale – that elusive development unicorn?

If you want to look at the future of procuring solutions for the development sector, should you be looking at initiatives like Citymart, which works with municipalities across the world to rethink traditional procurement and unleash the expertise and innovation capabilities of their citizens? By the same token, projects like Pathogen Box, Poverty Stoplight or Patient Innovation point to a brave new world where lead-user innovation and harnessing ‘sticky’ local knowledge becomes the norm, rather than the exception. You would also be forgiven for thinking that social movements across the world are the place to look for signs of future mechanisms for harnessing collective intelligence – Kawal Pemilu’s “citizen experts” self-organising around the Indonesian elections in 2014 is a textbook case study in this department.

The list could go on and on: welcome to the era of development mutants. While established players in the development sector are engrossed in soul-searching and their fitness for purpose is being scrutinised from all quarters, a whole new set of players is emerging, unfettered by legacy and borrowing from a variety of different disciplines. They point to a potentially different future – indeed, many potentially different futures – for the sector…..

But what if we wanted to invert this paradigm? How could we move from denial to fruitful collaboration with the ‘edgeryders’ of the development sector and accelerate its transformation?

Adopting new programming principles

Based on our experience working with development organisations, we believe that partnering with the mutants involves two types of shifts for traditional players: at the programmatic and the operational level. At the programmatic level, our work on the ground led us to articulate the following emerging principles:

  1. Mapping what people have, not what they need: even though approaches like jugaad and positive deviance have been around for a long time, unfortunately the default starting point for many development projects is still mapping needs, not assets. Inverting this paradigm allows for potentially disruptive project design and partnerships to emerge. (Signs of the future: Patient Innovation, Edgeryders, Community Mirror, Premise)

  2. Getting ready for multiple futures: When distributed across an organisation and not limited to a centralised function, the discipline of scanning the horizon for emergent solutions that contradict the dominant paradigm can help move beyond the denial phase and develop new interfaces to collaborate with the mutants. Here the link between analysis (to understand not only what is probable, but also what is possible) and action is critical – otherwise this remains purely an academic exercise. (Signs of the future: OpenCare, Improstuctures, Seeds of Good Anthropocene, Museum of the Future)

  3. Running multiple parallel experiments: According to Dave Snowden, in order to intervene in a complex system “you need multiple parallel experiments and they should be based on different and competing theories/hypotheses”. Unfortunately, many development projects are still based on linear narratives and assumptions such as “if only we run an awareness-raising campaign, citizens will change their behaviour”. Turning linear narratives into hypotheses to be tested (without becoming religious about a specific approach) opens up the possibility to explore the solution landscape and collaborate with non-obvious partners who bring new approaches to the table. (Signs of the future: Chukua Hatua, GiveDirectly, Finnish PM’s Office of Experiments, Ideas42, Cognitive Edge)

  4. Embracing obliquity: A deep, granular understanding of local assets and dynamics along with system mapping (see point 5 below) and pairing behavioural experts with development practitioners can help identify entry points for exploring new types of intervention based on obliquity principles. Mutants are often faster in adopting this approach and partnering with them is a way to bypass organisational inertia and explore nonlinear interventions. (Signs of the future: Sardex, social prescriptions, forensic architecture)

  5. From projects to systems: development organisations genuinely interested in developing new partnerships need to make the shift from the project logic to system investments. This involves, among other things, shifting the focus from providing solutions to helping every actor in the system to develop a higher level of consciousness about the issues they are facing and to take better decisions over time. It also entails partnering with mutants to explore entirely new financial mechanisms. (Signs of the future: Lankelly Chase, Indonesia waste banks, Dark Matter Labs)

Adopting new interfaces for working with the mutants

Harvard Business School professor Carliss Baldwin argued that most bureaucracies these days have a ‘non-contractible’ problem: they don’t know where smart people are, or how to evaluate how good they are. Most importantly, most smart people don’t want to work for them because they find them either too callous, unrewarding or slow (or a combination of all of these)….(More)”

Friended, but not Friends: Federal Ethics Authorities Address Role of Social Media in Politics


CRS Reports & Analysis: “Since the rise of social media over the past decade, new platforms of technology have reinforced the adage that the law lags behind developments in technology. Government agencies, officials, and employees regularly use a number of social media options – e.g., Twitter, Facebook, etc. – that have led agencies to update existing ethics rules to reflect the unique issues that they may present. Two areas of ethics regulation affected by the increased role of social media are the ethical standards governing gifts to federal employees and the restrictions on employees’ political activities. These rules apply to employees in the executive branch, though separate ethics rules and guidance on similar topics apply to the House and Senate….(More)”

What Should We Do About Big Data Leaks?


Paul Ford at the New Republic: “I have a great fondness for government data, and the government has a great fondness for making more of it. Federal election financial data, for example, with every contribution identified, connected to a name and address. Or the results of the census. I don’t know if you’ve ever had the experience of downloading census data, but it’s pretty exciting. You can hold America on your hard drive! Meditate on the miracles of zip codes, the way the country is held together and addressable by arbitrary sets of digits.

You can download whole books, in PDF format, about the foreign policy of the Reagan Administration as it related to Russia. Negotiations over which door the Soviet ambassador would use to enter a building. Gigabytes and gigabytes of pure joy for the ephemeralist. The government is the greatest creator of ephemera ever.

Consider the Financial Crisis Inquiry Commission, or FCIC, created in 2009 to figure out exactly how the global economic pooch was screwed. The FCIC has made so much data, and has done an admirable job (caveats noted below) of arranging it. So much stuff. There are reams of treasure on a single FCIC web site, hosted at Stanford Law School: hundreds of MP3 files, for example, with interviews with Jamie Dimon of JPMorgan Chase and Lloyd Blankfein of Goldman Sachs. I am desperate to find time to write some code that automatically extracts random audio snippets from each and puts them on top of a slow ambient drone with plenty of reverb, so that I can relax to the dulcet tones of the financial industry explaining away its failings. (There’s a Paul Krugman interview that I assume is more critical.)
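
For what it is worth, the collage imagined above really is just a few lines of code. Here is a minimal sketch, assuming pydub (which requires ffmpeg) is installed and that the interview MP3s and an ambient drone track have already been downloaded; the local paths are hypothetical:

    # Sketch: overlay a random snippet from each FCIC interview MP3 on a
    # looping ambient drone. Assumes pydub + ffmpeg; paths are placeholders.
    import random
    from pathlib import Path

    from pydub import AudioSegment

    SNIPPET_MS = 8_000  # eight seconds per interview

    drone = AudioSegment.from_mp3("drone.mp3") - 12  # duck the drone by 12 dB
    collage = AudioSegment.empty()

    for mp3 in sorted(Path("fcic_interviews").glob("*.mp3")):
        interview = AudioSegment.from_mp3(mp3)
        start = random.randrange(max(1, len(interview) - SNIPPET_MS))
        collage += interview[start:start + SNIPPET_MS].fade_in(500).fade_out(500)

    # Loop the drone bed to the collage's length, mix, and render.
    bed = (drone * (len(collage) // len(drone) + 1))[:len(collage)]
    bed.overlay(collage).export("dulcet_tones_of_finance.mp3", format="mp3")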

The recordings are just the beginning. They’ve released so many documents, and with the documents, a finding aid that you can download in handy PDF format, which will tell you where to, well, find things, pointing to thousands of documents. That aid alone is 1,439 pages.

Look, it is excellent that this exists, in public, on the web. But it also presents a very contemporary problem: What is transparency in the age of massive database drops? The data is available, but locked in MP3s and PDFs and other documents; it’s not searchable in the way a web page is searchable, not easy to comment on or share.
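
Even a partial fix is cheap. As a minimal sketch, here is how one might make a folder of PDF releases at least greppable, assuming pdfminer.six is installed; the folder name and search term below are placeholders:

    # Sketch: dump the text of every PDF in a folder and grep it for a term.
    # Assumes pdfminer.six; the directory and query below are placeholders.
    import sys
    from pathlib import Path

    from pdfminer.high_level import extract_text

    def search_pdfs(directory: str, term: str) -> None:
        """Print each line of each PDF that contains the search term."""
        for pdf in sorted(Path(directory).glob("*.pdf")):
            try:
                text = extract_text(pdf)
            except Exception as err:  # scanned or malformed PDFs are common
                print(f"{pdf.name}: unreadable ({err})", file=sys.stderr)
                continue
            for line in text.splitlines():
                if term.lower() in line.lower():
                    print(f"{pdf.name}: {line.strip()}")

    search_pdfs("fcic_documents", "credit default swap")

Scanned-image PDFs would still need an OCR pass first, of course.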

Consider the WikiLeaks release of State Department cables. They were exhausting, there were so many of them, they were in all caps. Or the trove of data Edward Snowden gathered on a USB drive, or Chelsea Manning on CD. And the Ashley Madison leak, spread across database files and logs of credit card receipts. The massive and sprawling Sony leak, complete with whole email inboxes. And with the just-released Panama Papers, we see two exciting new developments: First, the consortium of media organizations that managed the leak actually came together and collectively, well, branded the papers, down to a hashtag (#panamapapers), informational website, etc. Second, the size of the leak itself—2.5 terabytes!—became a talking point, even though exactly what was contained within those terabytes was harder to understand. This, said the consortium of journalists that notably did not include The New York Times, The Washington Post, etc., is the big one. Stay tuned. And we are. But the fact remains: These artifacts are not accessible to any but the most assiduous amateur conspiracist; they’re the domain of professionals with the time and money to deal with them. Who else could be bothered?

If you watched the movie Spotlight, you saw journalists at work, pawing through reams of documents, going through, essentially, phone books. I am an inveterate downloader of such things. I love what they represent. And I’m also comfortable with many-gigabyte corpora spread across web sites. I know how to fetch data, how to consolidate it, and how to search it. I share this skill set with many data journalists, and these capacities have, in some ways, become the sole province of the media. Organs of journalism are among the only remaining cultural institutions that can fund investigations of this size and tease the data apart, identifying linkages and thus constructing informational webs that can, with great effort, be turned into narratives, yielding something like what we call “a story” or “the truth.” 

Spotlight was set around 2001, and it features a lot of people looking at things on paper. The problem has changed greatly since then: The data is everywhere. The media has been forced into a new cultural role, that of the arbiter of the giant and semi-legal database. ProPublica, a nonprofit that does a great deal of data gathering and data journalism and then shares its findings with other media outlets, is one example; it funded a project called DocumentCloud with other media organizations that simplifies the process of searching through giant piles of PDFs (e.g., court records, or the results of Freedom of Information Act requests).

At some level the sheer boredom and drudgery of managing these large data leaks make them immune to casual interest; even the Ashley Madison leak, which I downloaded, was basically an opaque pile of data and really quite boring unless you had some motive to poke around.

If this is the age of the citizen journalist, or at least the citizen opinion columnist, it’s also the age of the data journalist, with the news media acting as product managers of data leaks, making the information usable, browsable, attractive. There is an uneasy partnership between leakers and the media, just as there is an uneasy partnership between the press and the government, which would like some credit for its efforts, thank you very much, and wouldn’t mind if you gave it some points for transparency while you’re at it.

Pause for a second. There’s a glut of data, but most of it comes to us in ugly formats. What would happen if the things released in the interest of transparency were released in actual transparent formats?…(More)”