OECD Policy Paper: “This policy paper aims to help governments develop regulatory experimentation constructively and appropriately as part of their implementation of the 2021 OECD Recommendation for Agile Regulatory Governance to Harness Innovation. Regulatory experimentation can help promote adaptive learning and innovative and better-informed regulatory policies and practices. This policy paper examines key concepts, definitions and constitutive elements of regulatory experimentation. It outlines the rationale for using regulatory experimentation, discusses enabling factors and governance requirements, and presents a set of forward-looking conclusions…(More)”.
Plurality: The Future of Collaborative Technology and Democracy
Book by E. Glen Weyl, Audrey Tang and ⿻ Community: “Technology and democracy today are at odds: technology reinforces authoritarian surveillance and corrodes democratic institutions, while democracies fight back with restrictive regulation and public-sector conservatism. This conflict is not inevitable, however; it is the consequence of choosing to invest in technologies such as AI and cryptocurrencies at the expense of democratic principles. In some places, such as the Ethereum community, Estonia, Colorado, and especially Taiwan, the focus has shifted to technologies that promote pluralistic collaboration, and democracy and technology have prospered together. Written by leading figures of the Plurality movement, this book shows for the first time how every technologist, policymaker, business leader, and activist can use these tools to build a more collaborative, diverse, and productive democratic world.
When Uber arrived in Taiwan, it sparked controversy, as it has in most parts of the world. But rather than letting social media fuel the conflict, vTaiwan, a platform developed with the help of government ministers, invited citizens to share their feelings and engage in deep conversation with thousands of other participants about how online ride-hailing services should be regulated. The platform uses statistical tools often associated with AI to aggregate opinions, letting each participant quickly see a clear representation of the full range of viewpoints and contribute their own thoughts. Rather than amplifying division, it surfaces from the outset the viewpoints that enjoy broad support across diverse groups with different perspectives, producing a rough consensus, one that secured the benefits of this new form of ridesharing while protecting drivers' rights, which the government then implemented. The process has since been used in Taiwan to resolve dozens of controversial issues and has quickly spread to governments, cooperatives, and blockchain communities around the world…(More)”.
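The aggregation described above can be sketched in miniature: participants vote agree/disagree/pass on statements, the vote matrix is clustered into opinion groups, and statements supported by a majority of every group are surfaced as "rough consensus" candidates. The toy vote matrix, the simple k-means helper, its seed centers, and the majority rule below are all illustrative assumptions, not the actual vTaiwan or Polis implementation.

```python
import numpy as np

# Toy vote matrix: rows = participants, columns = statements.
# +1 = agree, -1 = disagree, 0 = pass/unseen.
votes = np.array([
    [ 1,  1, -1,  1],
    [ 1,  1, -1,  1],
    [ 1, -1,  1,  1],
    [ 1, -1,  1,  1],
    [ 1, -1,  1,  0],
])

def kmeans(X, k, iters=20):
    """Minimal k-means: group participants by voting pattern."""
    # Illustrative init: seed centers with two dissimilar participants.
    centers = X[[0, 2]].astype(float)
    for _ in range(iters):
        # Assign each participant to the nearest center.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        # Move each center to the mean of its group.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(votes, k=2)

# A statement is a rough-consensus candidate if a majority of EVERY
# opinion group agrees with it, not just an overall majority.
consensus = [
    s for s in range(votes.shape[1])
    if all(votes[labels == g, s].mean() > 0 for g in np.unique(labels))
]
print(consensus)  # → [0, 3]: the two statements both groups support
```

Note the design point this illustrates: statement 1 has majority support overall (3 of 5 participants lean one way on some axis), but only cross-group agreement counts, which is what pushes broadly supported viewpoints to the forefront rather than the preferences of the largest faction.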
The Unintended Consequences of Data Standardization
Article by Cathleen Clerkin: “The benefits of data standardization within the social sector—and indeed just about any industry—are multiple, important, and undeniable. Access to the same type of data over time lends the ability to track progress and increase accountability. For example, over the last 20 years, my organization, Candid, has tracked grantmaking by the largest foundations to assess changes in giving trends. The data allowed us to demonstrate philanthropy’s disinvestment in historically Black colleges and universities. Data standardization also creates opportunities for benchmarking—allowing individuals and organizations to assess how they stack up to their colleagues and competitors. Moreover, large amounts of standardized data can help predict trends in the sector. Finally—and perhaps most importantly to the social sector—data standardization invariably reduces the significant reporting burdens placed on nonprofits.
Yet, for all of its benefits, data is too often proposed as a universal cure that will allow us to unequivocally determine the success of social change programs and processes. The reality is far more complex and nuanced. Left unchecked, the unintended consequences of data standardization pose significant risks to achieving a more effective, efficient, and equitable social sector…(More)”.
Sludge Toolkit
About: “Sludge audits are a way to identify, quantify and remove sludge (unnecessary frictions) from government services. Using the NSW Government sludge audit method, you can
- understand where sludge is making your government service difficult to access
- quantify the impact of sludge on the community
- know where and how you can improve your service using behavioural science
- measure the impact of your service improvements…(More)”.
Creating an Integrated System of Data and Statistics on Household Income, Consumption, and Wealth: Time to Build
Report by the National Academies: “Many federal agencies provide data and statistics on inequality and related aspects of household income, consumption, and wealth (ICW). However, because the information provided by these agencies is often produced using different concepts, underlying data, and methods, the resulting estimates of poverty, inequality, mean and median household income, consumption, and wealth, as well as other statistics, do not always tell a consistent or easily interpretable story. Measures also differ in their accuracy, timeliness, and relevance, so it is difficult to address such questions as the effects of the Great Recession on household finances or of the Covid-19 pandemic and the ensuing relief efforts on household income and consumption. The presence of multiple, sometimes conflicting statistics at best muddies the waters of policy debates and, at worst, enables advocates with different policy perspectives to cherry-pick their preferred set of estimates. Achieving an integrated system of relevant, high-quality, and transparent household ICW data and statistics should go far to reduce disagreement about who has how much, and from what sources. Further, such data are essential to advance research on economic wellbeing and to ensure that policies are well targeted to achieve societal goals…(More)”.
Designing an instrument for scaling public sector innovations
Paper by Mirte A R van Hout, Rik B Braams, Paul Meijer, and Albert J Meijer: “Governments worldwide invest in developing and diffusing innovations to deal with wicked problems. While experiments and pilots flourish, governments struggle to successfully scale innovations. Public sector scaling remains understudied, and scholarly suggestions for scaling trajectories are lacking. Following a design approach, this research develops an academically grounded, practice-oriented scaling instrument for planning and reflecting on the scaling of public sector innovations. We design this instrument based on the academic literature, an empirical analysis of three scaling projects at the Dutch Ministry of Infrastructure and Water Management, and six focus groups with practitioners. This research proposes a context-specific and iterative understanding of scaling processes and contributes a typology of scaling barriers and an additional scaling strategy to the literature. The presented instrument increases our academic understanding of scaling and enables teams of policymakers, in cooperation with stakeholders, to plan and reflect on a context-specific scaling pathway for public sector innovations…(More)”.
Objectivity vs affect: how competing forms of legitimacy can polarize public debate in data-driven public consultation
Paper by Alison Powell: “How do data and objectivity become politicized? How do processes intended to include citizen voices instead push them into social media that intensify negative expression? This paper examines the possibility and limits of ‘agonistic data practices’ (Crooks & Currie, 2021) examining how data-driven consultation practices create competing forms of legitimacy for quantifiable knowledge and affective lived experience. Drawing on a two-year study of a private Facebook group self-presenting as a supportive space for working-class people critical of the development of ‘low-traffic neighbourhoods’ (LTNs), the paper reveals how the dynamics of ‘affective polarization’ associated the use of data with elite and exclusionary politics. Participants addressed this by framing their online contributions as ‘vernacular data’ and also by associating numerical data with exclusion and inequality. Over time the strong statements of feeling began to support content of a conspiratorial nature, reflected at the social level of discourse in the broader media environment where stories of strong feeling gain legitimacy in right-wing sources. The paper concludes that ideologies of dataism and practices of datafication may create conditions for political extremism to develop when the potential conditions of ‘agonistic data practices’ are not met, and that consultation processes must avoid overly valorizing data and calculable knowledge if they wish to retain democratic accountability…(More)”.
AI for Good: Applications in Sustainability, Humanitarian Action, and Health
Book by Juan M. Lavista Ferres and William B. Weeks: “…an insightful and fascinating discussion of how one of the world’s most recognizable software companies is tackling intractable social problems with the power of artificial intelligence (AI). In the book, you’ll learn about how climate change, illness and disease, and challenges to fundamental human rights are all being fought using replicable methods and reusable AI code.
The authors also provide:
- Easy-to-follow, non-technical explanations of what AI is and how it works
- Examinations of how healthcare is being improved, climate change is being addressed, and humanitarian aid is being facilitated around the world with AI
- Discussions of the future of AI in the realm of social benefit organizations and efforts
An essential guide to impactful social change with artificial intelligence, AI for Good is a must-read resource for technical and non-technical professionals interested in AI’s social potential, as well as policymakers, regulators, NGO professionals, and non-profit volunteers…(More)”.
Third Millennium Thinking: Creating Sense in a World of Nonsense
Book by Saul Perlmutter, John Campbell and Robert MacCoun: “In our deluge of information, it’s getting harder and harder to distinguish the revelatory from the contradictory. How do we make health decisions in the face of conflicting medical advice? Does the research cited in that article even show what the authors claim? How can we navigate the next Thanksgiving discussion with our in-laws, who follow completely different experts on the topic of climate change?
In Third Millennium Thinking, a physicist, a psychologist, and a philosopher introduce readers to the tools and frameworks that scientists have developed to keep from fooling themselves, to understand the world, and to make decisions. We can all borrow these trust-building techniques to tackle problems both big and small.
Readers will learn:
- How to achieve a ground-level understanding of the facts of the modern world
- How to chart a course through a profusion of possibilities
- How to work together to take on the challenges we face today
- And much more
Using provocative thought exercises, jargon-free language, and vivid illustrations drawn from history, daily life, and scientists’ insider stories, Third Millennium Thinking offers a novel approach for readers to make sense of the nonsense…(More)”.
EBP+: Integrating science into policy evaluation using Evidential Pluralism
Article by Joe Jones, Alexandra Trofimov, Michael Wilde & Jon Williamson: “…While the need to integrate scientific evidence in policymaking is clear, there isn’t a universally accepted framework for doing so in practice. Orthodox evidence-based approaches take Randomised Controlled Trials (RCTs) as the gold standard of evidence. Others argue that social policy issues require theory-based methods to understand the complexities of policy interventions. These divisions may only further decrease trust in science at this critical time.
EBP+ offers a broader framework within which both orthodox and theory-based methods can sit. EBP+ also provides a systematic account of how to integrate and evaluate these different types of evidence. EBP+ can offer consistency and objectivity in policy evaluation, and could yield a unified approach that increases public trust in scientifically-informed policy…
EBP+ is motivated by Evidential Pluralism, a philosophical theory of causal enquiry that has been developed over the last 15 years. Evidential Pluralism encompasses two key claims. The first, object pluralism, says that establishing that A is a cause of B (e.g., that a policy intervention causes a specific outcome) requires establishing both that A and B are appropriately correlated and that there is some mechanism which links the two and which can account for the extent of the correlation. The second claim, study pluralism, maintains that assessing whether A is a cause of B requires assessing both association studies (studies that repeatedly measure A and B, together with potential confounders, to measure their association) and mechanistic studies (studies of features of the mechanisms linking A to B), where available…(More)”.
