The Promise of Evidence-Based Policymaking


Final Report by the Commission on Evidence-Based Policymaking: “…There are many barriers to the effective use of government data to generate evidence. Better access to these data holds the potential for substantial gains for society. The Commission’s recommendations recognize that the country’s laws and practices are not currently optimized to support the use of data for evidence building, nor in a manner that best protects privacy. To correct these problems, the Commission makes the following recommendations:

  • Establish a National Secure Data Service to facilitate access to data for evidence building while ensuring privacy and transparency in how those data are used. As a state-of-the-art resource for improving government’s capacity to use the data it already collects, the National Secure Data Service will be able to temporarily link existing data and provide secure access to those data for exclusively statistical purposes in connection with approved projects. The National Secure Data Service will do this without creating a data clearinghouse or warehouse.
  • Require stringent privacy qualifications for acquiring and combining data for statistical purposes at the National Secure Data Service to ensure that data continue to be effectively protected while improving the government’s ability to understand the impacts of programs on a wider range of outcomes. At the same time, consider additional statutory changes to enable ongoing statistical production that, under the same stringent privacy qualifications, may make use of combined data.
  • Review and, where needed, revise laws authorizing Federal data collection and use to ensure that limited access to administrative and survey data is possible to return benefits to the public through improved programs and policies, but only under strict privacy controls.
  • Ensure state-collected quarterly earnings data are available for statistical purposes, including to support the many evidence-building activities for which earnings are an important outcome.
  • Make additional state-collected data about Federal programs available for evidence building. Where appropriate, states that administer programs with substantial Federal investment should in return provide the data necessary for evidence building.
  • Develop a uniform process for external researchers to apply and qualify for secure access to confidential government data for evidence-building purposes while protecting privacy by carefully restricting data access to qualified and approved researchers…(More)”

Big Data: A New Empiricism and its Epistemic and Socio-Political Consequences


Chapter by Gernot Rieder and Judith Simon in Berechenbarkeit der Welt?: “The paper investigates the rise of Big Data in contemporary society. It examines the most prominent epistemological claims made by Big Data proponents, calls attention to the potential socio-political consequences of blind data trust, and proposes a possible way forward. The paper’s main focus is on the interplay between an emerging new empiricism and an increasingly opaque algorithmic environment that challenges democratic demands for transparency and accountability. It concludes that a responsible culture of quantification requires epistemic vigilance as well as a greater awareness of the potential dangers and pitfalls of an ever more data-driven society….(More)”.

Want to make your vote really count? Stick a blockchain on it


Niall Firth at New Scientist: “Bitcoin changed the way we think about money forever. Now a type of political cryptocurrency wants to do the same for votes, reinventing how we participate in democracy.

Sovereign is being unveiled this week by Democracy Earth, a not-for-profit organisation in Palo Alto, California. It combines liquid democracy – which gives individuals more flexibility in how they use their votes – with blockchains, digital ledgers of transactions that keep cryptocurrencies like bitcoin secure. Sovereign’s developers hope it could signal the beginning of a democratic system that transcends national borders.

“There’s an intrinsic incompatibility between the internet and nation states,” says Santiago Siri, one of Democracy Earth’s co-founders. “If we’re going to think about digital governance, we need to think in a borderless, global way.”

 The basic concept of liquid democracy is that voters can express their wishes on an issue directly or delegate their vote to someone else they think is better-placed to decide on their behalf. In turn, those delegates can also pass those votes upwards through the chain. Crucially, users can see how their delegate voted and reclaim their vote to use themselves.

It’s an attractive concept, but it hasn’t been without problems. One is that a seemingly unending series of votes saps the motivation of users, so fewer votes are cast over time. Additionally, a few “celebrities” can garner an unhealthy number of delegated votes and wield too much power – an issue Germany’s Pirate Party ran into when experimenting with liquid democracy.

Siri thinks Sovereign can solve both of these problems. It sits on existing blockchain software platforms, such as Ethereum, but instead of producing units of cryptocurrency, Sovereign creates a finite number of tokens called “votes”. These are assigned to registered users who can vote as part of organisations that set themselves up on the network, whether that is a political party, a municipality, a country or even a co-operatively run company.

No knowledge of blockchains is required – voters simply use an app. Votes are then “dripped” into their accounts over time like a universal basic income of votes. Users can debate with each other before deciding which way to vote. A single vote takes just a tap, while more votes can be assigned to a single issue using a slider bar….(More)”
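
To make the delegation mechanics concrete, here is a toy sketch in Python of the flow described above: token-based voting, transitive delegation, visible delegate choices, and reclaimable votes. It is an illustration under stated assumptions (one dripped token per voter, a simple cycle guard), not Sovereign’s actual code.

```python
from collections import defaultdict

class LiquidPoll:
    """Toy model of liquid democracy on a single issue: voters spend vote
    tokens directly or delegate them; delegation is transitive, visible,
    and can be reclaimed at any time."""

    def __init__(self, voters):
        self.tokens = {v: 1 for v in voters}   # one "dripped" vote token each
        self.delegate_of = {}                  # voter -> chosen delegate
        self.choice_of = {}                    # direct votes: voter -> option

    def delegate(self, voter, delegate):
        self.delegate_of[voter] = delegate
        self.choice_of.pop(voter, None)        # delegating suspends a direct vote

    def reclaim(self, voter):
        self.delegate_of.pop(voter, None)      # take the vote back to use yourself

    def vote(self, voter, option):
        self.choice_of[voter] = option

    def _resolve(self, voter):
        seen = {voter}
        while voter in self.delegate_of:       # follow the chain upward
            voter = self.delegate_of[voter]
            if voter in seen:                  # guard against delegation cycles
                break
            seen.add(voter)
        return voter

    def tally(self):
        totals = defaultdict(int)
        for voter, weight in self.tokens.items():
            decider = self._resolve(voter)
            if decider in self.choice_of:      # users can see how delegates voted
                totals[self.choice_of[decider]] += weight
        return dict(totals)

poll = LiquidPoll(["ana", "ben", "cho"])
poll.delegate("ana", "ben")    # ana trusts ben on this issue
poll.vote("ben", "yes")
poll.vote("cho", "no")
print(poll.tally())            # yes: 2, no: 1
poll.reclaim("ana")            # ana disagrees with ben and takes her vote back
poll.vote("ana", "no")
print(poll.tally())            # yes: 1, no: 2
```

Assigning more votes to a single issue via the slider bar would correspond here to spending a weight greater than one per voter.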

How are Italian Companies Embracing Open Data?


Are companies embracing the use of open government data? How, why and what data is being leveraged? To answer these questions, the GovLab started a project three years ago, Open Data 500, to map and assess — in a comparative manner, across sectors and countries — the private sector’s use of open data to develop new products and services, and create social value.

Today we are launching Open Data 200 Italy, in partnership with Fondazione Bruno Kessler, which seeks to showcase the breadth and depth of companies using open data in Italy.

OD200 Italy is the first and only platform to map the use of open data by companies in Italy. 

Our findings show there is a growing ecosystem around open data in Italy that goes beyond traditional open data advocates. …

The OD200 Italy project shows the diversity of data being used, underscoring the need to keep the supply of open data broad and sustained.

“The merits and use of open data for businesses are often praised but not supported by evidence. OD200 Italy is a great contribution to the evidence base of who, how and why corporations are leveraging open data,” said Stefaan Verhulst, Co-Founder of The GovLab and Chief Research and Development Officer. “Policy makers, practitioners and researchers can leverage the data generated by this initiative to improve the supply and use of open data, or to generate new insights. As such, OD200 Italy is a new open data set on open data.”…(More)”.

Plato and the Nerd: The Creative Partnership of Humans and Technology


MIT Press: “In this book, Edward Ashford Lee makes a bold claim: that the creators of digital technology have an unsurpassed medium for creativity. Technology has advanced to the point where progress seems limited not by physical constraints but by the human imagination. Writing for both literate technologists and numerate humanists, Lee makes a case for engineering—creating technology—as a deeply intellectual and fundamentally creative process. Explaining why digital technology has been so transformative and so liberating, Lee argues that the real power of technology stems from its partnership with humans.

Lee explores the ways that engineers use models and abstraction to build inventive artificial worlds and to give us things that we never dreamed of—for example, the ability to carry in our pockets everything humans have ever published. But he also attempts to counter the runaway enthusiasm of some technology boosters who claim everything in the physical world is a computation—that even such complex phenomena as human cognition are software operating on digital data. Lee argues that the evidence for this is weak, and the likelihood that nature has limited itself to processes that conform to today’s notion of digital computation is remote.

Lee goes on to argue that artificial intelligence’s goal of reproducing human cognitive functions in computers vastly underestimates the potential of computers. In his view, technology is coevolving with humans. It augments our cognitive and physical capabilities while we nurture, develop, and propagate the technology itself. Complementarity is more likely than competition….(More)”.

Artificial Intelligence and Public Policy


Paper by Adam D. Thierer, Andrea Castillo, and Raymond Russell: “There is growing interest in the market potential of artificial intelligence (AI) technologies and applications as well as in the potential risks that these technologies might pose. As a result, questions are being raised about the legal and regulatory governance of AI, machine learning, “autonomous” systems, and related robotic and data technologies. Citing concerns about labor market effects, social inequality, and even physical harm, some have called for precautionary regulations that could have the effect of limiting AI development and deployment. In this paper, we recommend a different policy framework for AI technologies. At this nascent stage of AI technology development, we think a better case can be made for prudence, patience, and a continuing embrace of “permissionless innovation” as it pertains to modern digital technologies. Unless a compelling case can be made that a new invention will bring serious harm to society, innovation should be allowed to continue unabated, and problems, if they develop at all, can be addressed later…(More)”.

Chatbot helps asylum seekers prepare for their interviews


Springwise: “MarHub is a new chatbot developed by students at the University of California-Berkeley’s Haas School of Business to help asylum seekers through the complicated process of applying to become an official refugee – which can take up to 18 months – and to avoid using smugglers.

Finding the right information for the asylum process isn’t easy, and although most asylum seekers are in possession of a smartphone, a lot of the information is either missing or out of date. MarHub is designed to help with that, as it will walk the user through what they can expect and also how to present their case. MarHub is also expandable, so that new information or regulations can be quickly added to make it a hub of useful information.
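
One way to read “expandable” here is that the bot’s guidance lives in a data table rather than in code, so new rules or procedures become a data update. The sketch below is a hypothetical illustration of that design; the topics and placeholder answers are invented, not MarHub’s actual content.

```python
# Hypothetical sketch of an expandable Q&A bot: guidance is data, not code.
KNOWLEDGE = {
    "interview": "placeholder: what to expect and how to present your case",
    "timeline": "placeholder: the process can take up to 18 months",
}

def register_topic(topic, answer):
    """New information or regulations are added without changing the bot."""
    KNOWLEDGE[topic.lower()] = answer

def reply(message):
    """Answer with the first known topic mentioned in the message."""
    text = message.lower()
    for topic, answer in KNOWLEDGE.items():
        if topic in text:
            return answer
    return "I can help with: " + ", ".join(sorted(KNOWLEDGE))

register_topic("documents", "placeholder: papers to gather beforehand")
print(reply("What happens at my interview?"))
```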

The concept of MarHub was born in late 2016, in response to the Hult Prize social enterprise challenge, which was focusing on refugees for 2017. The development team quickly realized that there was a gap in the market which they felt they could fill. MarHub will initially be made available through Facebook, and then later on WhatsApp and text messaging….(More)”.

Harnessing the Data Revolution to Achieve the Sustainable Development Goals


Erol Yayboke et al. at CSIS: “Functioning societies collect accurate data and utilize the evidence to inform policy. The use of evidence derived from data in policymaking requires the capability to collect and analyze accurate data, clear administrative channels through which timely evidence is made available to decisionmakers, and the political will to rely on—and ideally share—the evidence. The collection of accurate and timely data, especially in the developing world, is often logistically difficult, not politically expedient, and/or expensive.

Before launching its second round of global goals—the Sustainable Development Goals (SDGs)—the United Nations convened a High-Level Panel of Eminent Persons on the Post-2015 Development Agenda. As part of its final report, the Panel called for a “data revolution” and recommended the formation of an independent body to lead the charge. The report resulted in the creation of the Global Partnership for Sustainable Development Data (GPSDD)—an independent group of countries, companies, data communities, and NGOs—and the SDG Data Labs, a private initiative partnered with the GPSDD. In doing so the United Nations and its partners signaled broad interest in data and evidence-based policymaking at a high level. In fact, the GPSDD calls for the “revolution in data” by addressing the “crisis of non-existent, inaccessible or unreliable data.” As this report shows, this is easier said than done.

This report defines the data revolution as an unprecedented increase in the volume and types of data—and the subsequent demand for them—thanks to the ongoing yet uneven proliferation of new technologies. This revolution is allowing governments, companies, researchers, and citizens to monitor progress and drive action, often with real-time, dynamic, disaggregated data. Much work will be needed to make sure the data revolution reaches developing countries facing difficult challenges (i.e., before the data revolution fully becomes the data revolution for sustainable development). It is important to think of the revolution as a multistep process, beginning with building basic knowledge and awareness of the value of data. This is followed by a more specific focus on public-private partnerships, opportunities, and constraints regarding collection and utilization of data for evidence-based policy decisions….

This report provides the following recommendations to the international community to play a constructive role in the data revolution:

  • Don’t fixate on big data alone. Focus on the foundation necessary to facilitate leapfrogs around all types of data: small, big, and everywhere in between.
  • Increase funding for capacity building as part of an expansion of broader educational development priorities.
  • Highlight, share, and support enlightened government-driven approaches to data.
  • Increase funding for the data revolution and coordinate donor efforts.
  • Coordinate UN data revolution-related activities closely with an expanded GPSDD.
  • Secure consensus on data sharing, ownership, and privacy-related international standards….(More)”.

MIT map offers real-time, crowd-sourced flood reporting during Hurricane Irma


MIT News: “As Hurricane Irma bears down on the U.S., the MIT Urban Risk Lab has launched a free, open-source platform that will help residents and government officials track flooding in Broward County, Florida. The platform, RiskMap.us, is being piloted to enable both residents and emergency managers to obtain better information on flooding conditions in near-real time.

Residents affected by flooding can add information to the publicly available map via popular social media channels. Using Twitter, Facebook, and Telegram, users submit reports by sending a direct message to the Risk Map chatbot. The chatbot replies to users with a one-time link through which they can upload information including location, flood depth, a photo, and description.
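
A minimal sketch of how such a one-time-link flow can work is below, assuming an in-memory token store, a 15-minute expiry, and invented field names; the real platform (whose code is open source) will differ.

```python
import secrets
import time

PENDING = {}          # token -> (user_id, issued_at); pending one-time links
REPORTS = []          # accepted flood reports shown on the public map
LINK_TTL = 15 * 60    # assumed expiry window: 15 minutes

def handle_direct_message(user_id):
    """Reply to a chat DM with a single-use link for submitting a report."""
    token = secrets.token_urlsafe(16)
    PENDING[token] = (user_id, time.time())
    return f"https://riskmap.us/report/{token}"   # sent back via the chat channel

def submit_report(token, lat, lon, depth_cm, photo_url=None, text=""):
    """Accept at most one report per link, then invalidate the token."""
    issued = PENDING.pop(token, None)             # pop makes the link single-use
    if issued is None or time.time() - issued[1] > LINK_TTL:
        raise PermissionError("link invalid or expired")
    report = {
        "user": issued[0],
        "location": (lat, lon),
        "depth_cm": depth_cm,
        "photo": photo_url,
        "text": text,
        "received": time.time(),
    }
    REPORTS.append(report)
    return report

link = handle_direct_message("@resident")
token = link.rsplit("/", 1)[1]
submit_report(token, lat=26.12, lon=-80.14, depth_cm=45, text="street flooded")
```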

Residents and government officials can view the map to see recent flood reports to understand changing flood conditions across the county. Tomas Holderness, a research scientist in the MIT Department of Architecture, led the design of the system. “This project shows the importance that citizen data has to play in emergencies,” he says. “By connecting residents and emergency managers via social messaging, our map helps keep people informed and improve response times.”…

The Urban Risk Lab also piloted the system in Indonesia — where the project is called PetaBencana.id, or “Map Disaster” — during a large flood event on Feb. 20, 2017.

During the flooding, over 300,000 users visited the public website in 24 hours, and the map was integrated into the Uber application to help drivers avoid flood waters. The project in Indonesia is supported by a grant from USAID and is working in collaboration with the Indonesian Federal Emergency Management Agency, the Pacific Disaster Centre, and the Humanitarian OpenStreetMap Team.

The Urban Risk Lab team is also working in India on RiskMap.in….(More)”.

Feeding the Machine: Policing, Crime Data, & Algorithms


Elizabeth E. Joh at William & Mary Bill of Rights J. (2017 Forthcoming): “Discussions of predictive algorithms used by the police tend to assume the police are merely end users of big data. Accordingly, police departments are consumers and clients of big data — not much different than users of Spotify, Netflix, Amazon, or Facebook. Yet this assumption about big data policing contains a flaw. Police are not simply end users of big data. They generate the information that big data programs rely upon. This essay explains why predictive policing programs can’t be fully understood without an acknowledgment of the role police have in creating their inputs. Their choices, priorities, and even omissions become the inputs algorithms use to forecast crime. The filtered nature of crime data matters because these programs promise cutting-edge results, but may deliver analyses with hidden limitations….(More)”.
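
The essay’s central mechanism (forecasts trained on records that reflect earlier police choices) can be illustrated with a toy feedback simulation. Every number and rule below is invented for illustration; it models no real department or product.

```python
# Two districts with identical true crime; patrols start slightly skewed.
TRUE_RATE = {"north": 10, "south": 10}
patrols = {"north": 6, "south": 4}             # ten patrol units total

def observed(district):
    """Police record only the crime they are present to see."""
    return TRUE_RATE[district] * patrols[district] / 10

for step in range(5):
    records = {d: observed(d) for d in patrols}
    # "Predictive" reallocation concentrates patrols where records are
    # highest; squaring mimics a winner-take-more hot-spot ranking.
    weights = {d: records[d] ** 2 for d in records}
    total = sum(weights.values())
    patrols = {d: round(10 * weights[d] / total) for d in weights}
    print(step, records, patrols)

# With identical underlying crime, the more-patrolled district generates
# more records, attracts more patrols, and soon absorbs them all: the
# algorithm's inputs were the police's own prior choices.
```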