As the Quantity of Data Explodes, Quality Matters


Article by Katherine Barrett and Richard Greene: “With advances in technology, governments across the world are increasingly using data to help inform their decision making. This has been one of the most important byproducts of the use of open data, which is “a philosophy – and increasingly a set of policies – that promotes transparency, accountability and value creation by making government data available to all,” according to the Organisation for Economic Co-operation and Development (OECD).

But as data has become ever more important to governments, the quality of that data has become an increasingly serious issue. A number of nations, including the United States, are taking steps to deal with it. For example, according to a study from Deloitte, “The Dutch government is raising the bar to enable better data quality and governance across the public sector.” In the same report, a case study about Finland states that “data needs to be shared at the right time and in the right way. It is also important to improve the quality and usability of government data to achieve the right goals.” And the United Kingdom has developed its Government Data Quality Hub to help public sector organizations “better identify their data challenges and opportunities and effectively plan targeted improvements.”

Our personal experience is with U.S. states and local governments, and in that arena the road toward higher quality data is a long and difficult one, particularly as the sheer quantity of data has grown exponentially. As things stand, based on our ongoing research into performance audits, it is clear that issues with data are impediments to the smooth process of state and local governments…(More)”.
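
To make “data quality” concrete for technically minded readers: a minimal, hypothetical sketch in Python of the kinds of automated checks that audits of government datasets rely on. The field names and rules here are illustrative assumptions, not drawn from the article or any real audit program:

```python
def quality_report(records):
    """Toy data-quality checks of the kind auditors automate:
    completeness (required fields present), validity (values in range),
    and uniqueness (no duplicate IDs)."""
    required = {"id", "agency", "amount"}
    incomplete = [r for r in records if not required <= r.keys()]
    invalid = [r for r in records if "amount" in r and r["amount"] < 0]
    ids = [r["id"] for r in records if "id" in r]
    duplicate_ids = len(ids) - len(set(ids))
    return {"records": len(records), "incomplete": len(incomplete),
            "invalid_amounts": len(invalid), "duplicate_ids": duplicate_ids}

# Illustrative usage with made-up records
rows = [{"id": 1, "agency": "DOT", "amount": 50.0},
        {"id": 1, "agency": "DOT", "amount": -5.0},  # duplicate ID, negative amount
        {"id": 2, "agency": "HHS"}]                  # missing "amount" field
print(quality_report(rows))
```

None of this substitutes for governance, but it suggests why quality is auditable at all: completeness, validity, and consistency can be checked mechanically once the rules are written down.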

Digital Equity 2.0: How to Close the Data Divide


Report by Gillian Diebold: “For the last decade, closing the digital divide, or the gap between those subscribing to broadband and those not subscribing, has been a top priority for policymakers. But high-speed Internet and computing device access are no longer the only barriers to fully participating in and benefiting from the digital economy. Data is also increasingly essential, including in health care, financial services, and education. Like the digital divide, a gap has emerged between the data haves and the data have-nots, and this gap has introduced a new set of inequities: the data divide.

Policymakers have put a great deal of effort into closing the digital divide, and there is now near-universal acceptance of the notion that obtaining widespread Internet access generates social and economic benefits. But closing the data divide has received little attention. Moreover, efforts to improve data collection are typically overshadowed by privacy advocates’ warnings against collecting any data. In fact, unlike the digital divide, many ignore the data divide or argue that the way to close it is to collect vastly less data. But without substantial efforts to increase data representation and access, certain individuals and communities will be left behind in an increasingly data-driven world.

This report describes the multipronged efforts needed to address digital inequity. For the digital divide, policymakers have expanded digital connectivity, increased digital literacy, and improved access to digital devices. For the data divide, policymakers should similarly take a holistic approach, including by balancing privacy and data innovation, increasing data collection efforts across a wide array of fronts, enhancing access to data, improving data quality, and improving data analytics efforts. Applying lessons from the digital divide to this new challenge will help policymakers design effective and efficient policy and create a more equitable and effective data economy for all Americans…(More)”.

Civic Information Handbook


Handbook by Adrienne Goldstein: “Policymakers should update and enforce civil and human rights laws for the online environment, compel radical transparency, update consumer protection rules, insist that industry make a high-level commitment to democratic design, and create civic information infrastructure through a new PBS of the Internet. In the absence of such policy reform, amplifiers of civic information may never be able to beat out the well-resourced, well-networked groups that intentionally spread falsehoods. Nonetheless, there are strategies for helping civic information compete.

This handbook aims to:

  1. Educate civic information providers about coordinated deceptive campaigns

…including how they build their audiences, seed compelling narratives, amplify their messages, and activate their followers, as well as why false narratives take hold, and who the primary actors and targeted audiences are.

  2. Serve as a resource on how to flood the zone with trustworthy civic information

…namely, how civic information providers can repurpose the tactics used by coordinated deceptive campaigns in transparent, empowering ways and protect themselves and their message online.

This handbook will function as a media literacy tool, giving readers the skills and opportunity to consider who is behind networked information campaigns and how they spread their messages…(More)”.

Let’s Randomize America! 


Article by Dalton Conley: “…As our society has become less random, it has become more unequal. Many people know that inequality has been rising steadily over time, but a less-remarked-on development is that there’s been a parallel geographic shift, with high- and low-income people moving into separate, ever more distinct communities…As a sociologist, I study inequality and what can be done about it. It is, to say the least, a difficult problem to solve…I’ve come to believe that lotteries could help to crack this nut and make our society fairer and more equal. We can’t randomly assign where people live, of course. And we can’t integrate neighborhoods by fiat, either. We learned that lesson in the nineteen-seventies, when counties tried busing schoolchildren across town. Those programs aimed to create more racially and economically integrated schools; they resulted in the withdrawal of affluent students from urban public-school systems, and set off a political backlash that can still be felt today…

As a political tool, lotteries have come and gone throughout history. Sortition—the selection of political officials by lot—was first practiced in Athens in the sixth century B.C.E., and later reappeared in Renaissance city-states such as Florence, Venice, and Lombardy, and in Switzerland and elsewhere. In recent years, citizens’ councils—randomly chosen groups of individuals who meet to hammer out a particular issue, such as climate policy—have been tried in Canada, France, Iceland, Ireland, and the U.K. Some political theorists, such as Hélène Landemore, Jane Mansbridge, and the Belgian writer David Van Reybrouck, have argued that randomly selected decision-makers who don’t have to campaign are less likely to be corrupt or self-interested than those who must run for office; people chosen at random are also unlikely to be typically privileged, power-hungry politicians. The wisdom of the crowd improves when the crowd is more diverse…(More)”.
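
For readers curious how selection by lot works mechanically: a minimal sketch in Python, with a hypothetical roster and council size (none of this comes from the article; it simply illustrates a uniform random draw):

```python
import random

def draw_council(roster, size, seed=None):
    """Select a citizens' council by lot: every person on the roster
    has an equal chance of serving, and no one has to campaign."""
    rng = random.Random(seed)        # a fixed seed makes the draw publicly verifiable
    return rng.sample(roster, size)  # sampling without replacement

# Illustrative usage with made-up names
roster = ["Ada", "Ben", "Carmen", "Dev", "Esi", "Farid", "Grace", "Hugo"]
print(draw_council(roster, size=3, seed=2024))
```

Real citizens' councils typically layer stratification on top (drawing within age, region, and income groups so the panel mirrors the population), but the core mechanism is this uniform, seedable draw.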

Machines of mind: The case for an AI-powered productivity boom


Report by Martin Neil Baily, Erik Brynjolfsson, and Anton Korinek: “Large language models such as ChatGPT are emerging as powerful tools that not only make workers more productive but also increase the rate of innovation, laying the foundation for a significant acceleration in economic growth. As a general purpose technology, AI will impact a wide array of industries, prompting investments in new skills, transforming business processes, and altering the nature of work. However, official statistics will only partially capture the boost in productivity because the output of knowledge workers is difficult to measure. The rapid advances can have great benefits but may also lead to significant risks, so it is crucial to ensure that we steer progress in a direction that benefits all of society…(More)”.

Data portability and interoperability: A primer on two policy tools for regulation of digitized industries


Article by Sukhi Gulati-Gilbert and Robert Seamans: “…In this article we describe two other tools, data portability and interoperability, that may be particularly useful in technology-enabled sectors. Data portability allows users to move data from one company to another, helping to reduce switching costs and providing rival firms with access to valuable customer data. Interoperability allows two or more technical systems to exchange data interactively. Due to its interactive nature, interoperability can help prevent lock-in to a specific platform by allowing users to connect across platforms. Data portability and interoperability share some similarities; in addition to potential pro-competitive benefits, the tools promote values of openness, transparency, and consumer choice.
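
To make the two tools concrete: a rough sketch in Python of how each might look to a developer. The function names and schema are hypothetical assumptions, not taken from the article or any real platform; the point is the difference in shape — portability is a one-time, user-initiated export, while interoperability is an ongoing exchange against a shared schema.

```python
import json

# Data portability: a one-time export of a user's data in a common,
# machine-readable format that a rival service can import.
def export_user_data(user_record):
    return json.dumps(user_record, indent=2)   # handed to the user or a rival firm

def import_user_data(blob):
    return json.loads(blob)                    # the receiving service re-creates the account

# Interoperability: two live systems exchanging messages interactively
# against a shared schema, so users on different platforms stay connected.
SHARED_SCHEMA = {"sender", "recipient", "body"}

def send_cross_platform(message, deliver):
    if set(message) != SHARED_SCHEMA:
        raise ValueError("message does not conform to the shared schema")
    deliver(message)   # 'deliver' stands in for the other platform's intake hook

# Illustrative usage
blob = export_user_data({"handle": "@ada", "contacts": ["@ben"]})
restored = import_user_data(blob)
send_cross_platform({"sender": "@ada", "recipient": "@ben", "body": "hi"},
                    deliver=lambda m: print("delivered:", m))
```

The difference in shape matters for policy: an export format only has to be agreed once, while interoperability obliges both platforms to keep speaking the same protocol over time — one reason the article treats lock-in separately for each tool.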

After providing an overview of these topics, we describe the tradeoffs involved with implementing data portability and interoperability. While these policy tools offer lots of promise, in practice there can be many challenges involved when determining how to fund and design an implementation that is secure and intuitive and accomplishes the intended result.  These challenges require that policymakers think carefully about the initial implementation of data portability and interoperability. Finally, to better show how data portability and interoperability can increase competition in an industry, we discuss how they could be applied in the banking and social media sectors. These are just two examples of how data portability and interoperability policy could be applied to many different industries facing increased digitization. Our definitions and examples should be helpful to those interested in understanding the tradeoffs involved in using these tools to promote competition and innovation in the U.S. economy…(More)” See also: Data to Go: The Value of Data Portability as a Means to Data Liquidity.

The People and the Experts


Paper by William D. Nordhaus & Douglas Rivers: “Are speculators driving up oil prices? Should we raise energy prices to slow global warming? The present study takes a small number of such questions and compares the views of economic experts with those of the public. This comparison uses a panel of more than 2000 respondents from YouGov with the views of the panel of experts from the Initiative on Global Markets at the Chicago Booth School. We found that most of the US population is at best modestly informed about major economic questions and policies. The low level of knowledge is generally associated with the intrusion of ideological, political, and religious views that challenge or deny the current economic consensus. The intruding factors are highly heterogeneous across questions and sub-populations and are much more diverse than the narrowness of public political discourse would suggest. Many of these findings have been established for scientific subjects, but they appear to be equally important for economic views…(More)”.

Data Sharing Between Public and Private Sectors: When Local Governments Seek Information from the Sharing Economy


Paper by the Centre for Information Policy Leadership: “…addresses the growing trend of localities requesting (and sometimes mandating) that data collected by the private sector be shared with the localities themselves. Such requests are generally not in the context of law enforcement or national security matters, but rather are part of an effort to further the public interest or promote a public good.

To the extent such requests are overly broad or not specifically tailored to the stated public interest, CIPL believes that the public sector’s adoption of accountability measures—which CIPL has repeatedly promoted for the private sector—can advance responsible data sharing practices between the two sectors. It can also strengthen the public’s confidence in data-driven initiatives that seek to improve their communities…(More)”.

Spamming democracy


Article by Natalie Alms: “The White House’s Office of Information and Regulatory Affairs is considering AI’s effects on the regulatory process, including the potential for generative chatbots to fuel mass campaigns or inject spam comments into the federal agency rulemaking process.

A recent executive order directed the office to consider using guidance or tools to address mass comments, computer-generated comments and falsely attributed comments, something an administration official told FCW that OIRA is “moving forward” on.

Mark Febrizio, a senior policy analyst at George Washington University’s Regulatory Studies Center, has experimented with OpenAI’s generative AI system ChatGPT to create what he called a “convincing” public comment submission to a Labor Department proposal.

“Generative AI also takes the possibility of mass and malattributed comments to the next level,” wrote Febrizio and co-author Bridget Dooling, research professor at the center, in a paper published in April by the Brookings Institution.

The executive order comes years after astroturfing during the rollback of net neutrality policies by the Federal Communications Commission in 2017 garnered public attention. That rulemaking docket received a record-breaking 22 million-plus comments, but over 8.5 million came from a campaign against net neutrality led by broadband companies, according to an investigation by the New York Attorney General released in 2021. 

The investigation found that lead generators paid by these companies submitted many comments with real names and addresses attached without the knowledge or consent of those individuals. In the same docket were over 7 million comments supporting net neutrality submitted by a computer science student, who used software to submit comments attached to computer-generated names and addresses.
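
One reason raw comment counts mislead is that campaign-generated comments tend to be near-duplicates of a template. A hedged illustration in Python of flagging such pairs (a generic similarity technique, not how the FCC, OIRA, or the New York Attorney General actually screened these dockets):

```python
from difflib import SequenceMatcher

def near_duplicates(comments, threshold=0.9):
    """Flag pairs of comments whose normalized text is nearly identical,
    a rough signal of a templated mass-comment campaign."""
    normalized = [" ".join(c.lower().split()) for c in comments]
    pairs = []
    for i in range(len(normalized)):
        for j in range(i + 1, len(normalized)):  # O(n^2): fine for a demo only
            if SequenceMatcher(None, normalized[i], normalized[j]).ratio() >= threshold:
                pairs.append((i, j))
    return pairs

# Illustrative usage with invented comments
comments = [
    "I strongly oppose these rules. They burden providers.",
    "I strongly oppose these rules. They burden the providers.",
    "Please keep the open internet protections in place.",
]
print(near_duplicates(comments))  # flags the first two as a likely template pair
```

At the scale of a 22-million-comment docket, pairwise comparison is infeasible; practitioners would reach for hashing or text embeddings instead, but the underlying idea, clustering templated text, is the same.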

While the numbers are staggering, experts told FCW that agencies aren’t just counting comments when reading through submissions from the public…(More)”.

Unlocking the Power of Data Refineries for Social Impact


Essay by Jason Saul & Kriss Deiglmeier: “In 2021, US companies generated $2.77 trillion in profits—the largest ever recorded in history. This is a significant increase since 2000, when corporate profits totaled $786 billion. Social progress, on the other hand, shows a very different picture. From 2000 to 2021, progress on the United Nations Sustainable Development Goals has been anemic, registering less than 10 percent growth over 20 years.

What explains this massive split between the corporate and the social sectors? One explanation could be the role of data. In other words, companies are benefiting from a culture of using data to make decisions. Some refer to this as the “data divide”—the increasing gap between the use of data to maximize profit and the use of data to solve social problems…

Our theory is that there is something more systemic going on. Even if nonprofit practitioners and policy makers had the budget, capacity, and cultural appetite to use data, does the data they need even exist in the form they need it? We submit that the answer to this question is a resounding no. Usable data doesn’t yet exist for the sector because the sector lacks a fully functioning data ecosystem to create, analyze, and use data at the same level of effectiveness as the commercial sector…(More)”.