Open Data Maturity in Europe 2016


European Data Portal Report: “…the second in a series of annual studies, exploring the level of Open Data Maturity in the EU28 plus Norway, Switzerland and Liechtenstein – referred to as EU28+. The measurement is built on two key indicators: Open Data Readiness and Portal Maturity, covering both the level of development of national activities promoting Open Data and the level of development of national portals. In 2016, with a 28.6% increase compared to 2015, the EU28+ countries completed over 55% of their Open Data journey, showing that, by 2016, a majority of the EU28+ countries had successfully developed a basic approach to addressing Open Data. The Portal Maturity level increased by 22.6 percentage points, from 41.7% to 64.3%, thanks to the development of more advanced features on country data portals. The overall Open Data Maturity assessment groups countries into four clusters: Beginners, Followers, Fast Trackers and Trend Setters. Barriers to moving Open Data forward do remain. The report concludes with a series of recommendations, providing countries with guidance to further improve Open Data maturity. Countries need to raise more (political) awareness around Open Data, increase automated processes on their portals to improve the usability and re-usability of data, and organise more events and training sessions to support both local and national initiatives….(More)”.

Standard Business Reporting: Open Data to Cut Compliance Costs


Report by the Data Foundation: “Imagine if U.S. companies’ compliance costs could be reduced by billions of dollars. Imagine if this could happen without sacrificing any transparency to investors and governments. Open data can make that possible.

This first-ever research report, co-published by the Data Foundation and PwC, explains how Standard Business Reporting (SBR), in which multiple regulatory agencies adopt a common open data structure for the information they collect, reduces costs for both companies and agencies.

SBR programs are in place in the Netherlands, Australia, and elsewhere – but the concept is unknown in the United States. Our report is intended to introduce SBR to U.S. policymakers and lay the groundwork for future change….(More)”.
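The core of SBR, a single shared taxonomy from which each agency's report is derived, can be sketched in a few lines. The element and agency names below are hypothetical, not taken from any actual SBR taxonomy:

```python
# One filing, expressed against a shared taxonomy of standardized elements
# (hypothetical element names for illustration).
filing = {"Revenue": 1_200_000, "NetIncome": 150_000, "Employees": 42}

# Each agency declares which standardized elements it needs, rather than
# defining its own incompatible forms.
agency_requirements = {
    "TaxAuthority": ["Revenue", "NetIncome"],
    "LaborStatistics": ["Employees"],
}

def extract_reports(filing, requirements):
    """Derive every agency's report from the single standardized filing."""
    return {
        agency: {field: filing[field] for field in fields}
        for agency, fields in requirements.items()
    }

reports = extract_reports(filing, agency_requirements)
print(reports["TaxAuthority"])  # {'Revenue': 1200000, 'NetIncome': 150000}
```

The cost saving comes from the inversion of responsibility: the company reports once against the common structure, and each agency's view is a mechanical projection rather than a separate filing.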

Software used to predict crime can now be scoured for bias


Dave Gershgorn in Quartz: “Predictive policing, or the idea that software can foresee where crime will take place, is being adopted across the country—despite being riddled with issues. These algorithms have been shown to disproportionately target minorities, and private companies won’t reveal how their software reached those conclusions.

In an attempt to stand out from the pack, predictive-policing startup CivicScape has released its algorithm and data online for experts to scour, according to Government Technology magazine. The company’s GitHub page is already populated with its code, as well as a variety of documents detailing how its algorithm interprets police data and which variables are included when predicting crime.

“By making our code and data open-source, we are inviting feedback and conversation about CivicScape in the belief that many eyes make our tools better for all,” the company writes on GitHub. “We must understand and measure bias in crime data that can result in disparate public safety outcomes within a community.”…

CivicScape says it does not use race or ethnicity data to make predictions, although it is aware of other indirect indicators of race that could bias its software. The software also filters out low-level drug crimes, which have been found to be heavily biased against African Americans.
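Releasing code and data lets outsiders run exactly this kind of bias measurement themselves. As one illustration (not CivicScape's actual method; all names and figures here are made up), a disparate-impact check compares how often a model flags different areas:

```python
from collections import defaultdict

def disparate_impact_ratio(predictions, groups):
    """Ratio of the lowest to highest positive-prediction rate across
    groups; values far below 1.0 suggest disparate impact."""
    counts = defaultdict(lambda: [0, 0])  # group -> [flagged, total]
    for pred, group in zip(predictions, groups):
        counts[group][0] += pred
        counts[group][1] += 1
    rates = {g: flagged / total for g, (flagged, total) in counts.items()}
    return min(rates.values()) / max(rates.values()), rates

# Hypothetical audit: does the model flag one neighborhood far more often?
preds = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
areas = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
ratio, rates = disparate_impact_ratio(preds, areas)
print(rates)  # {'A': 0.8, 'B': 0.2}
print(ratio)  # 0.25 -- area A is flagged four times as often as area B
```

A real audit would control for base rates and proxy variables, but even this crude ratio is only computable because predictions and inputs are public.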

While this startup might be the first to publicly reveal the inner machinations of its algorithm and data practices, it’s not an assurance that predictive policing can be made fair and transparent across the board.

“Lots of research is going on about how algorithms can be transparent, accountable, and fair,” the company writes. “We look forward to being involved in this important conversation.”…(More)”.

Civic Tech & GovTech: An Overlooked Lucrative Opportunity for Technology Startups


Elena Mesropyan at LTP: “Civic technology, or Civic Tech, is defined as technology that enables greater participation in government or otherwise assists government in delivering citizen services and strengthening ties with the public. In other words, Civic Tech is where the public lends its talents, usually voluntarily, to help government do a better job. Moreover, Omidyar Network (which invested over $90 million across 35 civic tech organizations over the past decade) emphasizes that, like a movement, civic tech is mission-driven, focused on making a change that benefits the public, and in most cases enables better public input into decision making.

As an emerging sector, Civic Tech incorporates any technology that is used to empower citizens or help make government more accessible, efficient, and effective. Civic tech isn’t just talk, Omidyar notes; it is a community of people coming together to create tangible projects and take action. The civic tech and open data movements have grown with the ubiquity of personal technology.

Civic tech can be defined as a convergence of various fields. An example of such convergence has been given by the Knight Foundation, a national foundation with the goal of fostering informed and engaged communities to power a healthy democracy:


Source: The Emergence of Civic Tech: Investments in a Growing Field

In a report titled Engines of Change: What Civic Tech Can Learn From Social Movements, Civic Tech is divided into three categories:

  • Citizen to Citizen (C2C): Technology that improves citizen mobilization or improves connections between citizens
  • Citizen to Government (C2G): Technology that improves the frequency or quality of interaction between citizens and government
  • Government Technology (Govtech): Innovative technology solutions that make government more efficient and effective at service delivery

In 2015, Forbes reported that Civic Tech makes up almost a quarter of local and state government spending on technology….

Civic tech initiatives address a diverse range of industries – from energy and payments to agriculture and telecommunications. Mattermark outlines the following top ten industries associated with government and civic tech:

…There are certainly many more examples of GovTech/civic tech companies, and of tech startups offering solutions across the board that can significantly improve the way governments are run and services are delivered to citizens and businesses. More importantly, GovTech should no longer be considered a charity or a solely non-profit type of venture. Recently reviewed global P2G payment flows alone, for example, are estimated at $7.7 trillion and represent a significant feature of the global payments landscape. For low- and lower-middle-income countries alone, the figure hits $375 billion (~50% of annual government expenditure)….(More)”

Seeing Theory


Seeing Theory is a project designed and created by Daniel Kunin with support from Brown University’s Royce Fellowship Program. The goal of the project is to make statistics more accessible to a wider range of students through interactive visualizations.

Statistics is quickly becoming one of the most important and multi-disciplinary fields of mathematics. According to the American Statistical Association, “statistician” is one of the top ten fastest-growing occupations, and statistics is one of the fastest-growing bachelor’s degrees. Statistical literacy is essential to our data-driven society. Yet, for all the increased importance of and demand for statistical competence, the pedagogical approaches in statistics have barely changed. Using Mike Bostock’s data visualization library, D3.js, Seeing Theory visualizes the fundamental concepts covered in an introductory college statistics or Advanced Placement statistics class. Students are encouraged to use Seeing Theory as a resource alongside their textbook, professor and peers….(More)”
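The kind of concept Seeing Theory animates interactively, for instance the law of large numbers, can also be explored in a few lines of simulation code. A quick sketch (not affiliated with the project) showing the running proportion of heads in fair coin flips settling toward 0.5:

```python
import random

def running_means(n_flips, seed=0):
    """Simulate fair coin flips; return the running proportion of heads."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    heads, means = 0, []
    for i in range(1, n_flips + 1):
        heads += rng.random() < 0.5  # True/False counts as 1/0
        means.append(heads / i)
    return means

means = running_means(10_000)
# Early estimates swing widely; after thousands of flips the
# proportion hugs the true probability of 0.5.
print(means[9], means[-1])
```

Seeing Theory renders this same convergence as an animated chart; the visual version makes the "noisy early, stable late" behavior immediate in a way a printed table of numbers does not.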

Tactical Data Engagement guide


At the Sunlight Foundation: “United States cities face a critical challenge when it comes to fulfilling the potential of open data: moving beyond the mere provision of access to data toward the active facilitation of stakeholder use of data in ways that bring about community impact. Sunlight has been researching innovative projects and strategies that have helped cities tackle this challenge head on. Today we’re excited to share a guide for our new approach to open data in U.S. cities – an approach we’re calling “Tactical Data Engagement,” designed to drive community impact by connecting the dots between open data, public stakeholders, and collaborative action.

Access is critical but we have more work to do

Many city leaders have realized that open data is a valuable innovation to bring to city hall, and have invoked the promise of a new kind of relationship between government and the people: one where government works with the public in new collaborative ways. City mayors, managers, council members, and other leaders in the US are making commitments to this idea, with over 60 US cities having adopted open data reforms since 2006 (nearly 20 in 2016 alone), many with the help of the Sunlight team as part of our support for the What Works Cities initiative. While cities are building the public policy infrastructure for open data, they are also making technical advancements, as municipal IT and innovation departments build or procure new open data portals and proactively release more and more government datasets online….

However, … these developments alone are not enough. Portals and policies are critical infrastructure for the data-driven open government needed in the 21st century; but there has been and continues to be a disconnect between the rhetoric and promise of open data when compared to what it has meant in terms of practical reform. Let us be clear: the promise of open data is not about data on a website. The promise is for a new kind of relationship between government and the governed, one that brings about collaborative opportunities for impact. While many reforms have been successful in building an infrastructure of access, many have fallen short in leveraging that infrastructure for empowering residents and driving community change.

Announcing Tactical Data Engagement

In order to formulate an approach to help cities go further with their open data programs, Sunlight has been conducting an extensive review of the relevant literature on open data impact, and of the literature on approaches to community stakeholder engagement and co-creation (both civic-tech or open-data driven as well as more traditional)….

The result so far is our “Tactical Data Engagement” Guide (still in beta), designed to address what we see as the most critical challenge currently facing the open data movement: helping city open data programs build on a new infrastructure of access to facilitate the collaborative use of open data to empower residents and create tangible community impact…(More)”

Open Data Privacy Playbook


A data privacy playbook by Ben Green, Gabe Cunningham, Ariel Ekblaw, Paul Kominers, Andrew Linzer, and Susan Crawford: “Cities today collect and store a wide range of data that may contain sensitive or identifiable information about residents. As cities embrace open data initiatives, more of this information is available to the public. While releasing data has many important benefits, sharing data comes with inherent risks to individual privacy: released data can reveal information about individuals that would otherwise not be public knowledge. In recent years, open data such as taxi trips, voter registration files, and police records have revealed information that many believe should not be released.

Effective data governance is a prerequisite for successful open data programs. The goal of this document is to codify responsible privacy-protective approaches and processes that could be adopted by cities and other government organizations that are publicly releasing data. Our report is organized around four recommendations:

  • Conduct risk-benefit analyses to inform the design and implementation of open data programs.
  • Consider privacy at each stage of the data lifecycle: collect, maintain, release, delete.
  • Develop operational structures and processes that codify privacy management widely throughout the City.
  • Emphasize public engagement and public priorities as essential aspects of data management programs.

Each chapter of this report is dedicated to one of these four recommendations, and provides fundamental context along with specific suggestions to carry them out. In particular, we provide case studies of best practices from numerous cities and a set of forms and tactics for cities to implement our recommendations. The Appendix synthesizes key elements of the report into an Open Data Privacy Toolkit that cities can use to manage privacy when releasing data….(More)”
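One concrete check from this family of privacy-protective processes is k-anonymity: before release, verify that every combination of quasi-identifiers (attributes like ZIP code or age band that can be linked to outside data) is shared by at least k records. A minimal sketch, with hypothetical field names not drawn from the report:

```python
from collections import Counter

def k_anonymity_violations(records, quasi_identifiers, k=5):
    """Return quasi-identifier combinations shared by fewer than k
    records; any hit could single out individuals after release."""
    combos = Counter(
        tuple(rec[field] for field in quasi_identifiers) for rec in records
    )
    return {combo: n for combo, n in combos.items() if n < k}

records = [
    {"zip": "02138", "age_band": "30-39", "trip_fare": 12.50},
    {"zip": "02138", "age_band": "30-39", "trip_fare": 8.75},
    {"zip": "02139", "age_band": "60-69", "trip_fare": 22.00},
]
# With k=2, the lone 02139 / 60-69 record is too identifiable to release
# as-is; a city might generalize the ZIP or suppress the row.
risky = k_anonymity_violations(records, ["zip", "age_band"], k=2)
print(risky)
```

Checks like this slot naturally into the "release" stage of the data lifecycle the report describes, alongside the risk-benefit analysis that decides which fields count as quasi-identifiers.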

Connecting the dots: Building the case for open data to fight corruption


Web Foundation: “This research, published with Transparency International, measures the progress made by 5 key countries in implementing the G20 Anti-Corruption Open Data Principles.

These principles, adopted by G20 countries in 2015, committed countries to increasing and improving the publication of public information, driving forward open data as a tool in anti-corruption efforts.

However, this research – looking at Brazil, France, Germany, Indonesia and South Africa – finds a disappointing lack of progress. No country studied has released all the datasets identified as being key to anti-corruption, and much of the information is hard to find and hard to use.

Key findings:

  • No country released all anti-corruption datasets
  • Quality issues mean data is often not useful or usable
  • Much of the data is not published in line with open data standards, making comparability difficult
  • In many countries there is a lack of open data skills among officials in charge of anti-corruption initiatives

Download the overview report here (PDF), and access the individual country case studies: Brazil, France, Germany, Indonesia and South Africa… (More)”

Data Disrupts Corruption


Carlos Santiso & Ben Roseth at Stanford Social Innovation Review: “…The Panama Papers scandal demonstrates the power of data analytics to uncover corruption in a world flooded with terabytes needing only the computing capacity to make sense of it all. The Rousseff impeachment illustrates how open data can be used to bring leaders to account. Together, these stories show how data, both “big” and “open,” is driving the fight against corruption with fast-paced, evidence-driven, crowd-sourced efforts. Open data can put vast quantities of information into the hands of countless watchdogs and whistleblowers. Big data can turn that information into insight, making corruption easier to identify, trace, and predict. To realize the movement’s full potential, technologists, activists, officials, and citizens must redouble their efforts to integrate data analytics into policy making and government institutions….

Making big data open cannot, in itself, drive anticorruption efforts. “Without analytics,” a 2014 White House report on big data and individual privacy underscored, “big datasets could be stored, and they could be retrieved, wholly or selectively. But what comes out would be exactly what went in.”

In this context, it is useful to distinguish the four main stages of data analytics to illustrate its potential in the global fight against corruption:

  • Descriptive analytics uses data to describe what has happened when analyzing complex policy issues.
  • Diagnostic analytics goes a step further by mining and triangulating data to explain why a specific policy problem has happened, identify its root causes, and decipher underlying structural trends.
  • Predictive analytics uses data and algorithms, often drawing on machine learning, to predict what is most likely to occur.
  • Prescriptive analytics proposes what should be done to cause or prevent something from happening….
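Applied to open procurement data, the descriptive stage can be as simple as surfacing red flags, for instance suppliers who repeatedly win contracts with no competing bidders, which the diagnostic stage would then investigate. A toy sketch (data, field names, and the red-flag threshold are all hypothetical):

```python
from collections import defaultdict

def single_bid_rates(contracts):
    """Descriptive red flag: share of each supplier's contracts that
    were awarded with only one bidder."""
    stats = defaultdict(lambda: [0, 0])  # supplier -> [single-bid, total]
    for c in contracts:
        stats[c["supplier"]][0] += c["bidders"] == 1
        stats[c["supplier"]][1] += 1
    return {s: single / total for s, (single, total) in stats.items()}

contracts = [
    {"supplier": "Acme", "bidders": 1},
    {"supplier": "Acme", "bidders": 1},
    {"supplier": "Acme", "bidders": 1},
    {"supplier": "Bolt", "bidders": 4},
]
rates = single_bid_rates(contracts)
# Acme wins every contract uncontested: a pattern worth diagnosing,
# though by itself it proves nothing.
print(rates)  # {'Acme': 1.0, 'Bolt': 0.0}
```

This is the "descriptive" rung of the ladder the authors describe; the same counts could feed a predictive model, and a prescriptive step might recommend re-tendering rules for chronically single-bid categories.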

Despite the big data movement’s promise for fighting corruption, many challenges remain. The smart use of open and big data should focus not only on uncovering corruption, but also on better understanding its underlying causes and preventing its recurrence. Anticorruption analytics cannot exist in a vacuum; it must fit in a strategic institutional framework that starts with quality information and leads to reform. Even the most sophisticated technologies and data innovations cannot prevent what French novelist Théophile Gautier described as the “inexplicable attraction of corruption, even amongst the most honest souls.” Unless it is harnessed for improvements in governance and institutions, data analytics will not have the impact that it could, nor be sustainable in the long run…(More)”.

Big and open data are prompting a reform of scientific governance


Sabina Leonelli in Times Higher Education: “Big data are widely seen as a game-changer in scientific research, promising new and efficient ways to produce knowledge. And yet, large and diverse data collections are nothing new – they have long existed in fields such as meteorology, astronomy and natural history.

What, then, is all the fuss about? In my recent book, I argue that the true revolution is in the status accorded to data as research outputs in their own right. Along with this has come an emphasis on open data as crucial to excellent and reliable science.

Previously – ever since scientific journals emerged in the 17th century – data were private tools, owned by the scientists who produced them and scrutinised by a small circle of experts. Their usefulness lay in their function as evidence for a given hypothesis. This perception has shifted dramatically in the past decade. Increasingly, data are research components that can and should be made publicly available and usable.

Rather than the birth of a data-driven epistemology, we are thus witnessing the rise of a data-centric approach in which efforts to mobilise, integrate and visualise data become contributions to discovery, not just a by-product of hypothesis testing.

The rise of data-centrism highlights the challenges involved in gathering, classifying and interpreting data, and the concepts, technologies and social structures that surround these processes. This has implications for how research is conducted, organised, governed and assessed.

Data-centric science requires shifts in the rewards and incentives provided to those who produce, curate and analyse data. This challenges established hierarchies: laboratory technicians, librarians and database managers turn out to have crucial skills, subverting the common view of their jobs as marginal to knowledge production. Ideas of research excellence are also being challenged. Data management is increasingly recognised as crucial to the sustainability and impact of research, and national funders are moving away from citation counts and impact factors in evaluations.

New uses of data are forcing publishers to re-assess their business models and dissemination procedures, and research institutions are struggling to adapt their management and administration.

Data-centric science is emerging in concert with calls for increased openness in research….(More)”