As the Quantity of Data Explodes, Quality Matters


Article by Katherine Barrett and Richard Greene: “With advances in technology, governments across the world are increasingly using data to help inform their decision making. This has been one of the most important byproducts of the use of open data, which is “a philosophy – and increasingly a set of policies – that promotes transparency, accountability and value creation by making government data available to all,” according to the Organisation for Economic Co-operation and Development (OECD).

But as data has become ever more important to governments, the quality of that data has become an increasingly serious issue. A number of nations, including the United States, are taking steps to deal with it. For example, according to a study from Deloitte, “The Dutch government is raising the bar to enable better data quality and governance across the public sector.” In the same report, a case study about Finland states that “data needs to be shared at the right time and in the right way. It is also important to improve the quality and usability of government data to achieve the right goals.” And the United Kingdom has developed its Government Data Quality Hub to help public sector organizations “better identify their data challenges and opportunities and effectively plan targeted improvements.”

Our personal experience is with U.S. state and local governments, and in that arena the road toward higher-quality data is a long and difficult one, particularly as the sheer quantity of data has grown exponentially. As things stand, based on our ongoing research into performance audits, it is clear that data issues impede the smooth functioning of state and local governments…(More)”.