Field experimenting in economics: Lessons learned for public policy


Robert Metcalfe at OUP Blog: “Do neighbourhoods matter to outcomes? Which classroom interventions improve educational attainment? How should we raise money to provide important and valued public goods? Do energy prices affect energy demand? How can we motivate people to become healthier, greener, and more cooperative? These are some of the most challenging questions policy-makers face. Academics have been trying to understand and uncover these important relationships for decades.

Many of the empirical tools available to economists to answer these questions do not allow causal relationships to be detected. Field experiments represent a relatively new methodological approach capable of measuring the causal links between variables. By overlaying carefully designed experimental treatments on real people performing tasks common to their daily lives, economists are able to answer interesting and policy-relevant questions that were previously intractable. Manipulation of market environments allows these economists to uncover the hidden motivations behind economic behaviour more generally. A central tenet of field experiments in the policy world is that governments should understand the actual behavioural responses of their citizens to changes in policies or interventions.

Field experiments represent a departure from laboratory experiments. Traditionally, laboratory experiments create experimental settings with tight control over the decision environment of undergraduate students. While these studies also allow researchers to make causal statements, policy-makers are often concerned that subjects in these experiments may behave differently in settings where they know they are being observed or when they are permitted to sort out of the market.

For example, you might expect a college student to contribute more to charity when she is scrutinized in a professor’s lab than when she can avoid the ask altogether. Field experiments allow researchers to make these causal statements in a setting that is more generalizable to the behaviour policy-makers are directly interested in.
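The core logic of such an experiment can be sketched in a few lines. The household data below is entirely hypothetical (sample size, effect size, and variable names are illustrative assumptions); the point is that random assignment to treatment and control is what lets a simple difference in means be read as a causal effect.

```python
import random
import statistics

random.seed(42)

# Hypothetical field experiment: 200 households, half randomly assigned
# to receive an energy-saving "treatment" letter, half to a control group.
households = list(range(200))
random.shuffle(households)
treated = set(households[:100])  # random assignment permits causal inference

# Simulated weekly energy use (kWh); the treatment shifts the mean down.
usage = {
    h: random.gauss(95 if h in treated else 100, 10)
    for h in households
}

# The average treatment effect is estimated by a simple difference in means
# between the randomly formed treatment and control groups.
treat_mean = statistics.mean(usage[h] for h in households if h in treated)
control_mean = statistics.mean(usage[h] for h in households if h not in treated)
ate = treat_mean - control_mean
print(f"Estimated average treatment effect: {ate:.2f} kWh/week")
```

Because assignment is random, the two groups are comparable in expectation, so the difference in means estimates the causal effect of the intervention rather than a selection effect.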

Traditionally, policy-makers have gathered relevant information and data using focus groups, qualitative evidence, or observational data, without a way to identify causal mechanisms. It is quite easy to elicit people’s intentions about how they will behave with respect to a new policy or intervention, but there is increasing evidence that people’s intentions are a poor guide to predicting their behaviour.

However, we are starting to see a small change in how governments seek to answer pertinent questions. For instance, the UK tax office (Her Majesty’s Revenue and Customs) now uses field experiments across some of its services to make more effective use of scarce taxpayer money. In the US, there are movements toward gathering more evidence from field experiments.

In the corporate world, experimenting is not new. Many of the current large online companies—such as Amazon, Facebook, Google, and Microsoft—are constantly using field experiments matched with big data to improve their products and deliver better services to their customers. More and more companies will use field experiments over time to help them better set prices, tailor advertising, provide a better customer journey to increase welfare, and employ more productive workers…(More).

See also Field Experiments in the Developed World: An Introduction (Oxford Review of Economic Policy)

CMS announces entrepreneurs and innovators to access Medicare data


Centers for Medicare and Medicaid Services Press Release: “…the acting Centers for Medicare & Medicaid Services (CMS) Administrator, Andy Slavitt, announced a new policy that for the first time will allow innovators and entrepreneurs to access CMS data, such as Medicare claims. As part of the Administration’s commitment to use of data and information to drive transformation of the healthcare delivery system, CMS will allow innovators and entrepreneurs to conduct approved research that will ultimately improve care and provide better tools that should benefit health care consumers through a greater understanding of what the data says works best in health care. The data will not allow the patient’s identity to be determined, but will provide the identity of the providers of care. CMS will begin accepting innovator research requests in September 2015.

“Data is the essential ingredient to building a better, smarter, healthier system. Today’s announcement is aimed directly at shaking up health care innovation and setting a new standard for data transparency,” said acting CMS Administrator Andy Slavitt. “We expect a stream of new tools for beneficiaries and care providers that improve care and personalize decision-making.”

Innovators and entrepreneurs will access data via the CMS Virtual Research Data Center (VRDC) which provides access to granular CMS program data, including Medicare fee-for-service claims data, in an efficient and cost effective manner. Researchers working in the CMS VRDC have direct access to approved privacy-protected data files and are able to conduct their analysis within a secure CMS environment….

Examples of tools or products that innovators and entrepreneurs might develop include care management or predictive modeling tools, which could greatly benefit the healthcare system in the form of healthier people, better quality, or lower cost of care. Even though all data is privacy-protected, researchers will not be allowed to remove patient-level data from the VRDC. They will only be able to download aggregated, privacy-protected reports and results to their own personal workstations. …(More)”
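A common disclosure-avoidance approach behind "aggregated, privacy-protected reports" is small-cell suppression: results are rolled up into cells, and any cell covering too few patients is withheld. The sketch below is a toy illustration of that idea only; the records, field names, and the threshold of 11 are assumptions for the example, not the actual CMS schema or policy.

```python
from collections import defaultdict

# Hypothetical patient-level claims rows: (condition, region, cost).
records = (
    [("diabetes", "Northeast", 1000 + 50 * i) for i in range(15)]
    + [("asthma", "South", 400), ("asthma", "South", 450)]
)

MIN_CELL_SIZE = 11  # illustrative disclosure-avoidance threshold


def aggregate(rows, min_cell=MIN_CELL_SIZE):
    """Roll patient-level rows up to (condition, region) cells, dropping any
    cell with fewer than min_cell patients so that no published number can
    be traced back to an individual."""
    cells = defaultdict(list)
    for condition, region, cost in rows:
        cells[(condition, region)].append(cost)
    report = {}
    for key, costs in cells.items():
        if len(costs) >= min_cell:  # small cells are suppressed entirely
            report[key] = {"n": len(costs), "mean_cost": sum(costs) / len(costs)}
    return report


report = aggregate(records)
print(report)
```

Here the diabetes cell (15 patients) is published with a count and mean cost, while the asthma cell (2 patients) is suppressed outright rather than published with a masked value.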

New ODI research shows open data reaching every sector of UK industry


ODI: “New research has been published today (1 June) by the Open Data Institute showing that open data is reaching every sector of UK industry.

In various forms, open data is being adopted by a wide variety of businesses – small and large, new and old, from right across the country. The findings from Open data means business: UK innovation across sectors and regions draw on 270 companies with a combined turnover of £92bn and over 500k employees, identified by the ODI as using, producing or investing in open data as part of their business. The project included desk research, surveys and interviews on the companies’ experiences.

Key findings from the research include:

  • Companies using open data come from many sectors; over 46% from outside the information and communication sector. These include finance & insurance, science & technology, business administration & support, arts & entertainment, health, retail, transportation, education and energy.
  • The most popular datasets for companies are geospatial/mapping data (57%), transport data (43%) and environment data (42%).
  • 39% of companies innovating with open data are over 10 years old, with some more than 25 years old, proving open data isn’t just for new digital startups.
  • ‘Micro-enterprises’ (businesses with fewer than 10 employees) represented 70% of survey respondents, demonstrating a thriving open data startup scene. These businesses are using it to create services, products and platforms. 8% of respondents were drawn from large companies of 251 or more employees….
  • The companies surveyed listed 25 different government sources for the data they use. Notably, Ordnance Survey data was cited most frequently, by 14% of the companies. The non-government source most commonly used was OpenStreetMap, an openly licensed map of the world created by volunteers….(More)

Navigating the Health Data Ecosystem


New O’Reilly Media report on “The “Six C’s”: Understanding the Health Data Terrain in the Era of Precision Medicine”: “Data-driven technologies are now being adopted, developed, funded, and deployed throughout the health care market at an unprecedented scale. But, as this O’Reilly report reveals, health care innovation contains more hurdles and requires more finesse than many tech startups expect. By paying attention to the lessons from the report’s findings, innovation teams can better anticipate what they’ll face, and plan accordingly.

Simply put, teams looking to apply collective intelligence and “big data” platforms to health and health care problems often don’t appreciate the messy details of using and making sense of data in the heavily regulated hospital IT environment. Download this report today and learn how it helps prepare startups in six areas:

  1. Complexity: An enormous domain with noisy data not designed for machine consumption
  2. Computing: Lack of standard, interoperable schema for documenting human health in a digital format
  3. Context: Lack of critical contextual metadata for interpreting health data
  4. Culture: Startup difficulties in hospital ecosystems: why innovation can be a two-edged sword
  5. Contracts: Navigating the IRB, HIPAA, and EULA frameworks
  6. Commerce: The problem of how digital health startups get paid

This report represents the initial findings of a study funded by a grant from the Robert Wood Johnson Foundation. Subsequent reports will explore the results of three deep-dive projects the team pursued during the study. (More)”

Montreal plans to become a Smart City with free WiFi and open data


Ian Hardy at MobileSyrup: “Earlier this month, the Coderre Administration announced the Montreal Action Plan that includes 70 projects that will turn Montreal into a “smart city.”

The total allocated budget of $23 million is broken down into 6 sections — listed below with the official descriptions — and is targeted for completion by the end of 2017. Apart from ensuring a fast fiber network, “unleashing municipal data,” and the rollout of “intelligent transport systems” that will bring you real-time info on your subway/bus/car service, the city plans to deploy free WiFi.

According to the statement, Montreal will be deploying wireless access points in 750 locations to facilitate free public WiFi. The larger idea is to “enhance the experience of citizens, boost tourism and accelerate economic development of Montreal.”…

1. Public Wi-Fi: Deploy access points to extend coverage across the territory, create a harmonized experience and provide uniform performance across the network, in order to enhance the experience of citizens, boost tourism and accelerate the economic development of Montreal.

2. Very high speed, multiservice network: Adopt a telecommunications policy, create a one-stop shop for telecommunications, and integrate the telecommunications component into the charter of all major urban projects, so that all players in the Montreal community have access to a high-speed, multiservice fiber network that meets their current and future needs.

3. Smart city economic niche: Create an environment that facilitates the emergence of companies in the smart city economic niche, multiply the sources of innovation for solving urban problems, and simplify doing business with the City, so that Montreal becomes a leader in smart city innovation and accelerates its economic development.

4. Intelligent mobility: Make all mobility data available in real time, implement intelligent, intermodal and integrated transport systems, and deploy and support solutions designed to keep users informed and to optimize their mobility in real time across the entire territory.

5. Participatory democracy: Unleash municipal data, improve information management and governance, and adapt the means of citizen participation to make them accessible online, in order to improve access to the democratic process and consolidate the culture of transparency and accountability.

6. Digital public services: Make as many services as possible available through a multitude of digital channels, involve citizens in the development of services, and create opportunities for everyone to become familiar with their use, in order to provide access to municipal services 24/7 across multiple platforms….(More)”

Smart Cities, Smart Governments and Smart Citizens: A Brief Introduction


Paper by Gabriel Puron Cid et al in the International Journal of E-Planning Research (IJEPR): “Although the field of study surrounding the “smart city” is in an embryonic phase, the use of information and communication technologies (ICT) in urban settings is not new (Dameri and Rosenthal-Sabroux, 2014; Toh and Low, 1993; Tokmakoff and Billington, 1994). Since ancient times, cities and metropolitan areas have propelled social transformation and economic prosperity in many societies (Katz and Bradley, 2013). Many modern urban sites and metros have leveraged the success and competitiveness of ICTs (Caragliu, Del Bo and Nijkamp, 2011). At least in part, the recent growth of smart city initiatives can be attributed to the rapid adoption of mobile and sensor technologies, as well as the diversity of available Internet applications (Nam and Pardo, 2011; Oberti and Pavesi, 2013).

The effective use of technological innovations in urban sites has been embraced by the emergent term “smart city”, with a strong focus on improving living conditions, safeguarding the sustainability of the natural environment, and engaging with citizens more effectively and actively (Dameri and Rosenthal-Sabroux, 2014). Also referred to as digital cities or intelligent cities, many of these initiatives have been introduced as strategies to improve the utilization of physical infrastructure (e.g., roads and utility grids), engage citizens in active local governance and decision making, foster sustainable growth, and help government officials learn and innovate as the environment changes….(More)”

The Hague Declaration on Knowledge Discovery in the Digital Age


The Hague Declaration: “New technologies are revolutionising the way humans can learn about the world and about themselves. These technologies are not only a means of dealing with Big Data, they are also a key to knowledge discovery in the digital age; and their power is predicated on the increasing availability of data itself. Factors such as increasing computing power, the growth of the web, and governmental commitment to open access to publicly-funded research are serving to increase the availability of facts, data and ideas.

However, current legislative frameworks in different legal jurisdictions may not be cast in a way which supports the introduction of new approaches to undertaking research, in particular content mining. Content mining is the process of deriving information from machine-readable material. It works by copying large quantities of material, extracting the data, and recombining it to identify patterns and trends.

At the same time, intellectual property laws from a time well before the advent of the web limit the power of digital content analysis techniques such as text and data mining (for text and data) or content mining (for computer analysis of content in all formats). These factors are also creating inequalities in access to knowledge discovery in the digital age. The legislation in question might be copyright law, law governing patents or database laws – all of which may restrict the ability of the user to perform detailed content analysis.

Researchers should have the freedom to analyse and pursue intellectual curiosity without fear of monitoring or repercussions. These freedoms must not be eroded in the digital environment. Likewise, ethics around the use of data and content mining continue to evolve in response to changing technology.

Computer analysis of content in all formats, that is content mining, enables access to undiscovered public knowledge and provides important insights across every aspect of our economic, social and cultural life. Content mining will also have a profound impact for understanding society and societal movements (for example, predicting political uprisings, analysing demographical changes). Use of such techniques has the potential to revolutionise the way research is performed – both academic and commercial….(More: Declaration (PDF); Infographic)”
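The copy–extract–recombine process the declaration describes can be sketched minimally. The corpus, stopword list, and "trend" rule below are toy assumptions standing in for millions of machine-readable articles; real content-mining pipelines work the same way at vastly larger scale.

```python
import re
from collections import Counter

# Toy corpus standing in for a large collection of machine-readable articles.
corpus = {
    2013: "Open data and text mining reshape research. Mining tools extract facts.",
    2014: "Researchers apply data mining to open access articles to find patterns.",
    2015: "Content mining of open data reveals trends across millions of articles.",
}

STOPWORDS = {"and", "to", "of", "the", "a", "across"}


def extract_terms(text):
    """Extraction step: reduce raw copied text to normalized terms."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]


# Recombination step: count term frequencies per year, then identify terms
# that persist through every year of the collection -- a (toy) trend.
yearly = {year: Counter(extract_terms(text)) for year, text in corpus.items()}
persistent = set.intersection(*(set(c) for c in yearly.values()))
print(sorted(persistent))
```

The same three steps (bulk copying, term extraction, recombination into counts and intersections) underlie text and data mining generally, which is why copyright rules on copying bear so directly on the technique.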

Data for Development


Jeffrey D. Sachs at Project Syndicate: “The data revolution is rapidly transforming every part of society. Elections are managed with biometrics, forests are monitored by satellite imagery, banking has migrated from branch offices to smartphones, and medical x-rays are examined halfway around the world. With a bit of investment and foresight, spelled out in a new report on Data for Development prepared by the UN Sustainable Development Solutions Network (SDSN), the data revolution can drive a sustainable development revolution, and accelerate progress toward ending poverty, promoting social inclusion, and protecting the environment.

The world’s governments will adopt the new Sustainable Development Goals (SDGs) at a special United Nations summit on September 25. The occasion will likely be the largest gathering of world leaders in history, as some 170 heads of state and government adopt shared goals that will guide global development efforts until 2030. Of course, goals are easier to adopt than to achieve. So we will need new tools, including new data systems, to turn the SDGs into reality by 2030. In developing these new data systems, governments, businesses, and civil-society groups should promote four distinct purposes.

The first, and most important, is data for service delivery. The data revolution gives governments and businesses new and greatly improved ways to deliver services, fight corruption, cut red tape, and guarantee access in previously isolated places. Information technology is already revolutionizing the delivery of health care, education, governance, infrastructure (for example, prepaid electricity), banking, emergency response, and much more.

The second purpose is data for public management. Officials can now maintain real-time dashboards informing them of the current state of government facilities, transport networks, emergency relief operations, public health surveillance, violent crimes, and much more. Citizen feedback can also improve functioning, such as by crowd-sourcing traffic information from drivers. Geographic information systems (GIS) allow for real-time monitoring across local governments and districts in far-flung regions.

The third purpose is data for accountability of governments and businesses. It is a truism that government bureaucracies cut corners, hide gaps in service delivery, exaggerate performance, or, in the worst cases, simply steal when they can get away with it. Many businesses are no better. The data revolution can help to ensure that verifiable data are accessible to the general public and the intended recipients of public and private services. When services do not arrive on schedule (owing to, say, a bottleneck in construction or corruption in the supply chain), the data system will enable the public to pinpoint problems and hold governments and businesses to account.

Finally, the data revolution should enable the public to know whether or not a global goal or target has actually been achieved. The Millennium Development Goals, which were set in the year 2000, established quantitative targets for the year 2015. But, although we are now in the MDGs’ final year, we still lack precise knowledge of whether certain MDG targets have been achieved, owing to the absence of high-quality, timely data. Some of the most important MDG targets are reported with a lag of several years. The World Bank, for example, has not published detailed poverty data since 2010….(More)”

Quality of Public Administration – A Toolbox for Practitioners


European Commission: “The quality of a country’s institutions, both governmental and judicial, is a key determinant of its economic and societal well-being. Administrative capacity is increasingly recognised as a pre-requisite for delivering the EU’s treaty obligations and objectives, such as creating sustainable growth and jobs. The EU supports Member States’ administrations through the European Semester process and the European Structural and Investment Funds (ESIF). The Toolbox aims to support, guide and encourage those who want to build public administrations that will create prosperous, fair and resilient societies. It is intended as a reference and resource, not a prescription or a panacea, by signposting readers to existing EU policies and international practices, illustrated by almost 170 inspirational case studies.

This abridged version of the Toolbox (the full e-version will be published soon at http://ec.europa.eu/esf/toolbox) sets the scene for readers, lays out principles and values of good governance, summarises the seven thematic chapters (policy-making, ethics and anti-corruption, institutions, service delivery, business environment, justice systems and public finance management), and sets out some considerations for managing the ESIF’s thematic objective 11….(This publication is available in printed format in English)

A new approach to measuring the impact of open data


Julia Keseru at the Sunlight Foundation: “Strong evidence on the long-term impact of open data initiatives is incredibly scarce. The lack of compelling proof is partly due to the relative novelty of the open government field, but also to the inherent difficulties in measuring good governance and social change. We know that much of the impact of policy advocacy, for instance, occurs even before a new law or policy is introduced, and is thus incredibly difficult to evaluate. At the same time, it is also very hard to detect the causality between a direct change in the legal environment and the specific activities of a policy advocacy group. Attribution is equally challenging when it comes to assessing behavioral changes – who gets to take credit for increased political engagement and greater participation in democratic processes?

Open government projects tend to operate in an environment where the contribution of other stakeholders and initiatives is essential to achieving sustainable change, making it even more difficult to show the causality between a project’s activities and the impact it strives to achieve. Therefore, these initiatives cannot be described through simple “cause and effect” relationships, as they mostly achieve changes through their contribution to outcomes produced by a complex ecosystem of stakeholders — including journalists, think tanks, civil society organizations, public officials and many more — making it even more challenging to measure their direct impact.

We at the Sunlight Foundation wanted to tackle some of the methodological challenges of the field through building an evidence base that can empower further generalizations and advocacy efforts, as well as developing a methodological framework to unpack theories of change and to evaluate the impact of open data and digital transparency initiatives. A few weeks ago, we presented our research at the Cartagena Data Festival, and today we are happy to launch the first edition of our paper, which you can read below or on Scribd.

The outputs of this research include:

  • A searchable repository of more than 100 examples on the outputs, outcomes and impacts of open data and digital technology projects;
  • Three distinctive theories of change for open data and digital transparency initiatives from the Global South;
  • A methodological framework to help develop more robust indicators of social and political change for the ecosystem of open data initiatives, by applying and revising the Outcome Mapping approach of IDRC to the field…(You can read the study at: The Social Impact of Open Data by juliakeseru)