The Unintended Consequences of Data Standardization


Article by Cathleen Clerkin: “The benefits of data standardization within the social sector—and indeed just about any industry—are multiple, important, and undeniable. Access to the same type of data over time lends the ability to track progress and increase accountability. For example, over the last 20 years, my organization, Candid, has tracked grantmaking by the largest foundations to assess changes in giving trends. The data allowed us to demonstrate philanthropy’s disinvestment in historically Black colleges and universities. Data standardization also creates opportunities for benchmarking—allowing individuals and organizations to assess how they stack up to their colleagues and competitors. Moreover, large amounts of standardized data can help predict trends in the sector. Finally—and perhaps most importantly to the social sector—data standardization invariably reduces the significant reporting burdens placed on nonprofits.

Yet, for all of its benefits, data is too often proposed as a universal cure that will allow us to unequivocally determine the success of social change programs and processes. The reality is far more complex and nuanced. Left unchecked, the unintended consequences of data standardization pose significant risks to achieving a more effective, efficient, and equitable social sector…(More)”.

Data Authenticity, Consent, and Provenance for AI Are All Broken: What Will It Take to Fix Them?


Article by Shayne Longpre et al: “New AI capabilities are owed in large part to massive, widely sourced, and underdocumented training data collections. Dubious collection practices have spurred crises in data transparency, authenticity, consent, privacy, representation, bias, copyright infringement, and the overall development of ethical and trustworthy AI systems. In response, AI regulation is emphasizing the need for training data transparency to understand AI model limitations. Based on a large-scale analysis of the AI training data landscape and existing solutions, we identify the missing infrastructure to facilitate responsible AI development practices. We explain why existing tools for data authenticity, consent, and documentation alone are unable to solve the core problems facing the AI community, and outline how policymakers, developers, and data creators can facilitate responsible AI development, through universal data provenance standards…(More)”.
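
The paper's call for universal data provenance standards is easier to picture with a concrete record. As a purely illustrative Python sketch (the field names and example values below are our assumptions, not the authors' proposal), a machine-readable provenance entry might capture source, license, and consent basis alongside the transformations applied:

```python
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    """Hypothetical machine-readable provenance entry for one training dataset."""
    dataset_id: str       # stable identifier for the collection
    source_url: str       # where the data was originally obtained
    license: str          # terms under which the source released it
    consent_basis: str    # e.g. "explicit opt-in", "public domain", "unknown"
    collection_date: str  # when the data was gathered (ISO 8601)
    transformations: list = field(default_factory=list)  # cleaning/filtering steps

record = ProvenanceRecord(
    dataset_id="web-crawl-2023-09",
    source_url="https://example.org/crawl",  # placeholder URL
    license="CC-BY-4.0",
    consent_basis="unknown",                 # the common, problematic case
    collection_date="2023-09-14",
    transformations=["deduplicated", "language-filtered: en"],
)
print(record)
```

Even a toy schema like this makes the paper's point visible: for much of today's web-scraped training data, the honest value of a field like `consent_basis` would be "unknown".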

Why data about people are so hard to govern


Paper by Wendy H. Wong, Jamie Duncan, and David A. Lake: “How data on individuals are gathered, analyzed, and stored remains largely ungoverned at both domestic and global levels. We address the unique governance problem posed by digital data to provide a framework for understanding why data governance remains elusive. Data are easily transferable and replicable, making them a useful tool. But this characteristic creates massive governance problems for all of us who want to have some agency and choice over how (or if) our data are collected and used. Moreover, data are co-created: individuals are the object from which data are culled by an interested party. Yet, any data point has a marginal value of close to zero, and thus individuals have little bargaining power when it comes to negotiating with data collectors. Relatedly, data follow a winner-take-all rule: the parties that have the most can leverage that data for greater accuracy and utility, leading to natural oligopolies. Finally, data’s value lies in combination with proprietary algorithms that analyze and predict the patterns. Given these characteristics, private governance solutions are ineffective. Public solutions will also likely be insufficient. The imbalance in market power between platforms that collect data and individuals will be reproduced in the political sphere. We conclude that some form of collective data governance is required. We examine the challenges to data governance by looking at a public effort, the EU’s General Data Protection Regulation; a private effort, Apple’s “privacy nutrition labels” in its App Store; and a collective effort, the First Nations Information Governance Centre in Canada…(More)”.

Sludge Toolkit


About: “Sludge audits are a way to identify, quantify and remove sludge (unnecessary frictions) from government services. Using the NSW Government sludge audit method, you can:

  • understand where sludge is making your government service difficult to access
  • quantify the impact of sludge on the community
  • know where and how you can improve your service using behavioural science
  • measure the impact of your service improvements…(More)”.
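
To make "quantify the impact" concrete, a sludge audit typically converts friction into time and cost. The back-of-envelope sketch below is our own illustration, not the NSW method's actual formula; the applicant count, form duration, and wage value are hypothetical inputs:

```python
def sludge_time_cost(applicants: int, minutes_per_application: float, hourly_wage: float):
    """Rough community-wide cost of friction in a single government service.

    Illustrative only: values the time lost to form-filling at an average wage.
    """
    hours_lost = applicants * minutes_per_application / 60
    return hours_lost, hours_lost * hourly_wage

# Hypothetical service: 50,000 applicants, a 40-minute form, time valued at $45/hour
hours, cost = sludge_time_cost(50_000, 40, 45.0)
print(f"{hours:,.0f} hours lost, roughly ${cost:,.0f} in time costs")
```

Re-running the same calculation after a redesign (say, a 15-minute form) gives a simple before/after measure of a service improvement.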

This City Pilots Web3 Quadratic Funding for Public Infrastructure


Article by Makoto Takahiro: “The city of Split, Croatia, is piloting an innovative system for deciding how to fund municipal infrastructure projects. Called “quadratic funding,” the mechanism aims to fairly account for both public and private preferences when allocating limited budget resources.

A coalition of organizations including BlockSplit, Funding the Commons, Gitcoin, and the City of Split launched the Municipal Quadratic Funding Initiative in September 2023. The project goals include implementing quadratic funding for prioritizing public spending, utilizing web3 tools to increase transparency and participation, and demonstrating the potential of these technologies to improve legacy processes.

If successful, the model could scale to other towns and cities or inspire additional quadratic funding experiments.

The partners believe that the transparency and configurability of blockchain systems make them well-suited to quadratic funding applications.

Quadratic funding mathematically accounts for the intensity of demand for public goods. Groups can create projects that individuals can support financially. The amount of money ultimately directed to each proposal is based on the square of the sum of the square roots of the individual contributions it receives. This means that projects attracting larger numbers of smaller contributions can compete with those receiving fewer large donations.

In this way, quadratic funding aims to reflect both willingness to pay and breadth of support in funding decisions. It attempts to break the tendency towards corruption, where influential groups lobby for their niche interests. The goal is a fairer allocation suited to the whole community’s preferences.

The initiative will build on open source quadratic funding infrastructure already deployed for other uses like funding public goods on Ethereum. Practical web3 tools can help the administration manage funding rounds and disburse awards…(More)”.
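
The allocation rule described above is easy to state precisely. Below is a minimal Python sketch of the standard quadratic funding formula (ideal funding equals the square of the sum of the square roots of contributions); the project names, amounts, and pool-scaling rule are our illustrative assumptions, not the Split pilot's actual implementation:

```python
import math

def quadratic_funding(projects: dict, matching_pool: float) -> dict:
    """Split a matching pool across projects using quadratic funding.

    projects maps a project name to the list of its individual contributions.
    Returns project -> (direct total, matched amount).
    """
    direct = {p: sum(cs) for p, cs in projects.items()}
    # Ideal funding level: square of the sum of the square roots of contributions.
    ideal = {p: sum(math.sqrt(c) for c in cs) ** 2 for p, cs in projects.items()}
    # The match fills the gap between ideal and direct funding...
    raw_match = {p: max(ideal[p] - direct[p], 0.0) for p in projects}
    # ...scaled down proportionally if the pool cannot cover every gap.
    total_match = sum(raw_match.values())
    scale = min(1.0, matching_pool / total_match) if total_match > 0 else 0.0
    return {p: (direct[p], raw_match[p] * scale) for p in projects}

# Same direct total, very different breadth of support:
projects = {
    "bike-lanes": [5.0] * 100,   # 100 residents give 5 each -> 500 direct
    "parking-lot": [500.0],      # one donor gives 500       -> 500 direct
}
for name, (direct, match) in quadratic_funding(projects, matching_pool=10_000).items():
    print(f"{name}: direct={direct:.0f}, match={match:.0f}")
```

In this toy round, the broadly supported project captures essentially the entire match while the single-donor project gets none, which is exactly the "breadth of support" property the article describes.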

Creating an Integrated System of Data and Statistics on Household Income, Consumption, and Wealth: Time to Build


Report by the National Academies: “Many federal agencies provide data and statistics on inequality and related aspects of household income, consumption, and wealth (ICW). However, because the information provided by these agencies is often produced using different concepts, underlying data, and methods, the resulting estimates of poverty, inequality, mean and median household income, consumption, and wealth, as well as other statistics, do not always tell a consistent or easily interpretable story. Measures also differ in their accuracy, timeliness, and relevance, so that it is difficult to address such questions as the effects of the Great Recession on household finances or of the COVID-19 pandemic and the ensuing relief efforts on household income and consumption. The presence of multiple, sometimes conflicting statistics at best muddies the waters of policy debates and, at worst, enables advocates with different policy perspectives to cherry-pick their preferred set of estimates. Achieving an integrated system of relevant, high-quality, and transparent household ICW data and statistics should go far to reduce disagreement about who has how much, and from what sources. Further, such data are essential to advance research on economic wellbeing and to ensure that policies are well targeted to achieve societal goals…(More)”.

Designing an instrument for scaling public sector innovations


Paper by Mirte A R van Hout, Rik B Braams, Paul Meijer, and Albert J Meijer: “Governments worldwide invest in developing and diffusing innovations to deal with wicked problems. While experiments and pilots flourish, governments struggle to successfully scale innovations. Public sector scaling remains understudied, and scholarly suggestions for scaling trajectories are lacking. Following a design approach, this research develops an academically grounded, practice-oriented scaling instrument for planning and reflecting on the scaling of public sector innovations. We design this instrument based on the academic literature, an empirical analysis of three scaling projects at the Dutch Ministry of Infrastructure and Water Management, and six focus groups with practitioners. This research proposes a context-specific and iterative understanding of scaling processes and contributes a typology of scaling barriers and an additional scaling strategy to the literature. The presented instrument increases our academic understanding of scaling and enables teams of policymakers, in cooperation with stakeholders, to plan and reflect on a context-specific scaling pathway for public sector innovations…(More)”.

Digital transformation of public services


Policy Brief by Interreg Europe: “In a world of digital advancements, the public sector must undergo a comprehensive digital transformation to enhance service delivery efficiency, improve governance, foster innovation and increase citizen satisfaction.

The European Union is playing a leading role and has been actively developing policy frameworks for the digitalisation of the public sector. This policy brief provides a general overview of the most relevant initiatives, regulations, and strategies of the European Union, which are shaping Europe’s digital future.

The European Union’s strategy for the digital transformation of public services is centred on enhancing accessibility, efficiency, and user-centricity. This strategy also promotes interoperability among Member States, fostering seamless cross-border interactions. Privacy and security measures are integral to building trust in digital public services, with a focus on data protection and cybersecurity. Ultimately, the goal is to create a cohesive, digitally advanced public service ecosystem throughout the EU, with the active participation of the private sector (GovTech).

This policy brief outlines key policy improvements, good practices and recommendations stemming from the Interreg Europe projects BEST DIH, BETTER, ENAIBLER, Next2Met, Digital Regions, Digitourism, Inno Provement, ERUDITE, iBuy and Carpe Digem, to inform and guide policymakers as they embark upon digital transformation processes, as well as to encourage greater interregional cooperation…(More)”.

Objectivity vs affect: how competing forms of legitimacy can polarize public debate in data-driven public consultation


Paper by Alison Powell: “How do data and objectivity become politicized? How do processes intended to include citizen voices instead push them into social media that intensify negative expression? This paper examines the possibility and limits of ‘agonistic data practices’ (Crooks & Currie, 2021) examining how data-driven consultation practices create competing forms of legitimacy for quantifiable knowledge and affective lived experience. Drawing on a two-year study of a private Facebook group self-presenting as a supportive space for working-class people critical of the development of ‘low-traffic neighbourhoods’ (LTNs), the paper reveals how the dynamics of ‘affective polarization’ associated the use of data with elite and exclusionary politics. Participants addressed this by framing their online contributions as ‘vernacular data’ and also by associating numerical data with exclusion and inequality. Over time the strong statements of feeling began to support content of a conspiratorial nature, reflected at the social level of discourse in the broader media environment where stories of strong feeling gain legitimacy in right-wing sources. The paper concludes that ideologies of dataism and practices of datafication may create conditions for political extremism to develop when the potential conditions of ‘agonistic data practices’ are not met, and that consultation processes must avoid overly valorizing data and calculable knowledge if they wish to retain democratic accountability…(More)”.

AI and the Future of Government: Unexpected Effects and Critical Challenges


Policy Brief by Tiago C. Peixoto, Otaviano Canuto, and Luke Jordan: “Based on observable facts, this policy paper explores some of the less-acknowledged yet critically important ways in which artificial intelligence (AI) may affect the public sector and its role. Our focus is on those areas where AI’s influence might be understated currently, but where it has substantial implications for future government policies and actions.

We identify four main areas of impact that could redefine the public sector role, require new answers from it, or both. These areas are the emergence of a new language-based digital divide, jobs displacement in the public administration, disruptions in revenue mobilization, and declining government responsiveness.

This discussion not only identifies critical areas but also underscores the importance of transcending conventional approaches in tackling them. As we examine these challenges, we shed light on their significance, seeking to inform policymakers and stakeholders about the nuanced ways in which AI may quietly, yet profoundly, alter the public sector landscape…(More)”.