Access to New Data Sources for Statistics: Business Models and Incentives for the Corporate Sector


Report by Thilo Klein and Stefaan Verhulst: “New data sources, commonly referred to as “Big Data”, have attracted growing interest from National Statistical Institutes. They have the potential to complement official and more conventional statistics used, for instance, to determine progress towards the Sustainable Development Goals (SDGs) and other targets. However, it is often assumed that this type of data is readily available, which is not necessarily the case. This paper examines legal requirements and business incentives to obtain agreement on private data access, and more generally ways to facilitate the use of Big Data for statistical purposes. Using practical cases, the paper analyses the suitability of five generic data access models for different data sources and data uses in an emerging new data ecosystem. Concrete recommendations for policy action are presented in the conclusions….(More)”.

Open Data Maturity in Europe 2016


European Data Portal Report: “…the second in a series of annual studies, exploring the level of Open Data Maturity in the EU28 and Norway, Switzerland and Liechtenstein – referred to as EU28+. The measurement is built on two key indicators: Open Data Readiness and Portal Maturity, thereby covering the level of development of national activities promoting Open Data as well as the level of development of national portals. In 2016, with a 28.6% increase compared to 2015, the EU28+ countries completed over 55% of their Open Data journey, showing that, by 2016, a majority of the EU28+ countries had successfully developed a basic approach to address Open Data. The Portal Maturity level increased by 22.6 percentage points, from 41.7% to 64.3%, thanks to the development of more advanced features on country data portals. Based on overall Open Data Maturity, countries are grouped into four clusters: Beginners, Followers, Fast Trackers and Trend Setters. Barriers to moving Open Data forward do remain, however. The report concludes with a series of recommendations, providing countries with guidance to further improve Open Data maturity. Countries need to raise more (political) awareness around Open Data, increase automated processes on their portals to improve usability and re-usability of data, and organise more events and trainings to support both local and national initiatives….(More)”.

Data Quality Tester


“Publish What You Fund has launched a new online tool that allows aid and development finance publishers to independently check the quality of their data before they publish it to IATI. The aim of the Data Quality Tester – currently in Beta – is to indicate when information falls short of the specific data quality tests used to assess donors in the Aid Transparency Index. We expect it to be most useful for donors who are included in the Index to monitor their own progress both during and outside of the Index cycle.

Who is the Data Quality Tester for?

The Data Quality Tester is also suitable for organisations that want to start publishing to the IATI Standard, for those that do not qualify for inclusion in the Index, and for those that used to be assessed but are not currently. The open source online tool is useful because:

  • Both the IATI Standard and the Index tests can at times be complex; the tool allows a quick check against them so that donor agency staff can understand any issues
  • It allows publishers to internally and independently check the quality of their information before uploading to the IATI Registry, saving time and making sure that when data is uploaded, it is as good as it can be
  • It provides publishers with an opportunity to assess their data against the updated Index methodology and recognise where they need to improve

The tool is now live and available to use at: http://dataqualitytester.publishwhatyoufund.org/….(More)”
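
To make the kind of pre-publication check described above concrete, here is a minimal sketch in Python of an automated quality check run against an IATI-style activities file. It is not the Data Quality Tester's actual rule set or code: the file name, the three rules, and the assumption of an IATI 2.0x-style XML layout are all illustrative.

    # Illustrative sketch only -- not the Data Quality Tester's actual rules or code.
    # Assumes a hypothetical IATI 2.0x-style activities file named my_activities.xml.
    from lxml import etree

    def check_activities(path):
        tree = etree.parse(path)
        problems = []
        for activity in tree.findall("iati-activity"):
            iid = activity.findtext("iati-identifier", default="<missing identifier>")
            # Rule 1: every activity should carry at least one title narrative.
            if activity.find("title/narrative") is None:
                problems.append(iid + ": no title narrative")
            # Rule 2: every activity should have at least one dated activity-date.
            if not activity.findall("activity-date[@iso-date]"):
                problems.append(iid + ": no activity-date with an iso-date")
            # Rule 3: a sector should be declared at activity or transaction level.
            if activity.find("sector") is None and activity.find("transaction/sector") is None:
                problems.append(iid + ": no sector information")
        return problems

    for issue in check_activities("my_activities.xml"):
        print(issue)

The real Index methodology applies a far larger and more precise battery of tests, but the general pattern (parse the published file, apply each rule, report failures per activity) is what a tool of this kind automates before data reaches the IATI Registry.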

Book-Smart, Not Street-Smart: Blockchain-Based Smart Contracts and The Social Workings of Law


Paper by Karen E. C. Levy: “…critiques blockchain-based “smart contracts,” which aim to automatically and securely execute obligations without reliance on a centralized enforcement authority. Though smart contracts do have some features that might serve the goals of social justice and fairness, I suggest that they are based on a thin conception of what law does, and how it does it. Smart contracts focus on the technical form of contract to the exclusion of the social contexts within which contracts operate, and the complex ways in which people use them. In the real world, contractual obligations are enforced through all kinds of social mechanisms other than formal adjudication—and contracts serve many functions that are not explicitly legal in nature, or even designed to be formally enforced. I describe three categories of contracting practices in which people engage (the inclusion of facially unenforceable terms, the inclusion of purposefully underspecified terms, and willful nonenforcement of enforceable terms) to illustrate how contracts actually “work.” The technology of smart contracts neglects the fact that people use contracts as social resources to manage their relations. The inflexibility that they introduce, by design, might short-circuit a number of social uses to which law is routinely put. Therefore, I suggest that attention to the social and relational contexts of contracting is an essential consideration for the discussion, development, and deployment of smart contracts….(More)”

Entrepreneurial Administration


Research Paper by Phil Weiser: “A core failing of today’s administrative state and modern administrative law scholarship is the lack of imagination as to how agencies should operate. On the conventional telling, public agencies follow specific grants of regulatory authority, use the traditional tools of notice-and-comment rulemaking and adjudication, and are checked by judicial review. In reality, however, effective administration depends on entrepreneurial leadership that spearheads policy experimentation and trial-and-error problem-solving, including the development of regulatory programs that use non-traditional tools.

Entrepreneurial administration takes place at both public agencies and private entities, each of which can address regulatory challenges and earn regulatory authority as a result. Consider, for example, that Energy Star, a successful program that has encouraged the manufacture and sale of energy-efficient appliances, was developed and is overseen by the Environmental Protection Agency (EPA). After the EPA established the program, Congress codified it and, eventually, other countries followed suit. By contrast, the successful and complementary program encouraging the construction of energy-efficient buildings, the well-respected LEED standard, was developed and is overseen by a private organization. After it was developed, a number of governmental authorities endorsed it and have encouraged LEED-certified construction projects with both carrots and sticks. Significantly, while neither the Energy Star nor the LEED program was originally anticipated by any regulatory statute, both have had a tremendous impact.

The Energy Star and LEED case studies exemplify the sort of innovative regulatory strategies that are taking root in the modern administrative state. Despite the importance of entrepreneurial administration in practice, scholars have failed to examine the role of entrepreneurial leadership in spurring policy innovation and earning regulatory authority for an agency (or private entity). In short, administrative law needs a richer and more textured account of agency action, why entrepreneurial leadership matters in government, and how agencies should operate.

This Article explains that the conventional view of agency behavior — either following the specific direction of Congress or the President to use notice-and-comment rulemaking or adjudication processes — does not adequately portray how public agencies and private entities develop innovative regulatory strategies and earn regulatory authority as a result. In particular, this Article explains how governmental agencies like the EPA or private entities like the Green Building Council (which oversees the LEED standard) depend on entrepreneurial leadership to develop experimental regulatory strategies. It also explains how, in the wake of such experiments, legislative bodies have the opportunity to evaluate regulatory innovations in practice before deciding whether to embrace, revise, reject, or merely tolerate them.

This Article highlights the importance of entrepreneurial leadership in government, providing a number of examples of emerging regulatory experiments and suggesting how Congress should evaluate such experiments. This discussion explains how entrepreneurial leadership and a culture of experimentation and trial-and-error learning are necessary to develop innovative strategies and overcome the pressure to manage the status quo. In so doing, the Article underscores how policy entrepreneurship is integral to agency effectiveness, an important corrective to public choice theory, and a missing piece of modern administrative law scholarship….(More)”.

Avoiding Data Graveyards: Insights from Data Producers & Users in Three Countries


Report by Samantha Custer and Tanya Sethi: “Government, development partner, and civil society leaders make decisions every day about how to allocate, monitor and evaluate development assistance. Policymakers and practitioners can theoretically draw from more data sources, in a variety of formats, than ever before to inform these decisions, but will they choose to do so? Those who collect data and produce evidence are often far removed from those who ultimately influence and make decisions. Technocratic ideals of evidence-informed policymaking and data-driven decision-making are easily undercut by individual prerogatives, organizational imperatives, and ecosystem-wide blind spots.

In 2016, researchers from the AidData Center for Development Policy interviewed nearly 200 decision-makers and those who advise them in Honduras, Timor-Leste, and Senegal. Central government officials, development partner representatives based in country, and leaders of civil society organizations (CSOs) shared their experiences in producing and using data to target development projects, monitor progress, and evaluate results.

Specifically, the report answers three questions:

  • Who produces development data and statistics, for what purposes and for whom?
  • What are the technical and political constraints for decision-makers to use development data in their work?
  • What can funders and producers do differently to encourage use of data and evidence in decision-making?

Using a theory of change, we identify nine barriers to the use of data and corresponding operating principles for funders and producers to make demand-driven investments in the next generation of development data and statistics….(More)”.

Scientific crowdsourcing in wildlife research and conservation: Tigers (Panthera tigris) as a case study


Özgün Emre Can, Neil D’Cruze, Margaret Balaskas, and David W. Macdonald in PLOS Biology: “With around 3,200 tigers (Panthera tigris) left in the wild, the governments of 13 tiger range countries recently declared that there is a need for innovation to aid tiger research and conservation. In response to this call, we created the “Think for Tigers” study to explore whether crowdsourcing has the potential to innovate the way researchers and practitioners monitor tigers in the wild. The study demonstrated that the benefits of crowdsourcing are not restricted to harnessing time, labor, and funds from the public; crowdsourcing can also be used as a tool to harness creative thinking that contributes to the development of new research tools and approaches. Based on our experience, we make practical recommendations for designing a crowdsourcing initiative as a tool for generating ideas….(More)”

Geospatial big data and cartography: research challenges and opportunities for making maps that matter


International Journal of Cartography: “Geospatial big data present a new set of challenges and opportunities for cartographic researchers in technical, methodological and artistic realms. New computational and technical paradigms for cartography are accompanying the rise of geospatial big data. Additionally, the art and science of cartography needs to focus its contemporary efforts on work that connects to outside disciplines and is grounded in problems that are important to humankind and its sustainability. Following the development of position papers and a collaborative workshop to craft consensus around key topics, this article presents a new cartographic research agenda focused on making maps that matter using geospatial big data. This agenda provides both long-term challenges that require significant attention and short-term opportunities that we believe could be addressed in more concentrated studies….(More)”.

Bit By Bit: Social Research in the Digital Age


Open Review of Book by Matthew J. Salganik: “In the summer of 2009, mobile phones were ringing all across Rwanda. In addition to the millions of calls between family, friends, and business associates, about 1,000 Rwandans received a call from Joshua Blumenstock and his colleagues. The researchers were studying wealth and poverty by conducting a survey of people who had been randomly sampled from a database of 1.5 million customers from Rwanda’s largest mobile phone provider. Blumenstock and colleagues asked the participants if they wanted to participate in a survey, explained the nature of the research to them, and then asked a series of questions about their demographic, social, and economic characteristics.

Everything I have said up until now makes this sound like a traditional social science survey. But, what comes next is not traditional, at least not yet. They used the survey data to train a machine learning model to predict someone’s wealth from their call data, and then they used this model to estimate the wealth of all 1.5 million customers. Next, they estimated the place of residence of all 1.5 million customers by using the geographic information embedded in the call logs. Putting these two estimates together—the estimated wealth and the estimated place of residence—Blumenstock and colleagues were able to produce high-resolution estimates of the geographic distribution of wealth across Rwanda. In particular, they could produce an estimated wealth for each of Rwanda’s 2,148 cells, the smallest administrative unit in the country.
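
As a rough illustration of that two-step logic (learn a model of wealth from the roughly 1,000 surveyed subscribers, then apply it to all 1.5 million), here is a minimal sketch in Python. It is not Blumenstock and colleagues' actual code: the file names, the call-record features, the random-forest model, and the way the home cell is assigned are all assumptions made for illustration.

    # Illustrative sketch only -- not the actual pipeline from Blumenstock et al. (2015).
    # Assumes two hypothetical CSVs: the ~1,000 surveyed subscribers with call-record
    # features and a surveyed wealth index, and all ~1.5 million subscribers with the
    # same features plus an estimated home cell derived from their call locations.
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    survey = pd.read_csv("survey_respondents.csv")
    everyone = pd.read_csv("all_subscribers.csv")

    features = ["total_calls", "night_call_share", "n_contacts",
                "intl_call_share", "avg_topup_amount"]  # hypothetical feature names

    # Step 1: learn a mapping from call-record features to surveyed wealth.
    model = RandomForestRegressor(n_estimators=500, random_state=0)
    model.fit(survey[features], survey["wealth_index"])

    # Step 2: predict a wealth score for every subscriber in the database.
    everyone["wealth_hat"] = model.predict(everyone[features])

    # Step 3: combine predicted wealth with estimated place of residence to get a
    # fine-grained map, e.g. the average predicted wealth per administrative cell.
    cell_wealth = everyone.groupby("home_cell", as_index=False)["wealth_hat"].mean()
    print(cell_wealth.sort_values("wealth_hat").head())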

It was impossible to validate these estimates because no one had ever produced estimates for such small geographic areas in Rwanda. But, when Blumenstock and colleagues aggregated their estimates to Rwanda’s 30 districts, they found that their estimates were similar to estimates from the Demographic and Health Survey, the gold standard of surveys in developing countries. Although these two approaches produced similar estimates in this case, the approach of Blumenstock and colleagues was about 10 times faster and 50 times cheaper than the traditional Demographic and Health Surveys. These dramatically faster and lower cost estimates create new possibilities for researchers, governments, and companies (Blumenstock, Cadamuro, and On 2015).
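
Continuing the hypothetical sketch above, the district-level check amounts to aggregating the cell-level estimates up to Rwanda's 30 districts and seeing how closely they track the DHS figures. The lookup table, file names, and column names below are again illustrative assumptions, not the authors' data.

    # Illustrative validation sketch; all file names and columns are hypothetical.
    import pandas as pd

    cells = pd.read_csv("cell_wealth_estimates.csv")   # columns: home_cell, phone_wealth
    lookup = pd.read_csv("cell_to_district.csv")       # columns: home_cell, district
    dhs = pd.read_csv("dhs_district_wealth.csv")       # columns: district, dhs_wealth

    # Aggregate the ~2,148 cell-level estimates up to the 30 districts.
    districts = (cells.merge(lookup, on="home_cell")
                      .groupby("district", as_index=False)["phone_wealth"].mean())

    # Compare the phone-based district estimates with the DHS benchmark.
    merged = districts.merge(dhs, on="district")
    print(merged[["phone_wealth", "dhs_wealth"]].corr())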

In addition to developing a new methodology, this study is kind of like a Rorschach inkblot test; what people see depends on their background. Many social scientists see a new measurement tool that can be used to test theories about economic development. Many data scientists see a cool new machine learning problem. Many business people see a powerful approach for unlocking value in the digital trace data that they have already collected. Many privacy advocates see a scary reminder that we live in a time of mass surveillance. Many policy makers see a way that new technology can help create a better world. In fact, this study is all of those things, and that is why it is a window into the future of social research….(More)”