The Haves and the Have Nots: Civic Technologies and the Pathways to Government Responsiveness


Paper by Jonathan Mellon, Tiago C. Peixoto and Fredrik M. Sjoberg: “As civic life has moved online, scholars have questioned whether this will exacerbate political inequalities due to differences in access to technology. However, this concern typically assumes that unequal participation inevitably leads to unequal outcomes: if online participants are unrepresentative of the population, then participation outcomes will benefit groups who participate and disadvantage those who do not. This paper combines the results of eight previous studies of civic technology platforms and conducts new analysis to trace inequality throughout the participation chain, from (1) the existing digital divide, to (2) the profile of participants, to (3) the types of demands made through the platform, and, finally, to (4) policy outcomes.
The paper examines four civic technology models: online voting for participatory budgeting in Brazil, online local problem reporting in the United Kingdom, crowdsourced constitution drafting in Iceland, and online petitioning across 132 countries. In every case, the assumed links in the participation chain broke down because of the platform’s institutional features and the surrounding political process.
These results show that understanding how inequality is created requires examination of all stages of participation, as well as the resulting policy response. The assumption that inequalities in participation will always lead to the same inequalities in outcomes is not borne out in practice…(More)”.

Citizens’ Assemblies Are Upgrading Democracy: Fair Algorithms Are Part of the Program


Essay by Ariel Procaccia: “In 1983 the Eighth Amendment to the Irish constitution enshrined an abortion ban that had prevailed in the nation for more than a century. Public opinion on the issue shifted in the new millennium, however, and by 2016 it was clear that a real debate could no longer be avoided. But even relatively progressive politicians had long steered clear of the controversy rather than risk alienating voters. Who would be trustworthy and persuasive enough to break the deadlock?

The answer was a bunch of ordinary people. Seriously. The Irish Parliament convened a citizens’ assembly, whose 99 members were chosen at random. The selection process ensured that the group’s composition represented the Irish population along dimensions such as age, gender and geography. Over several months in 2016 and 2017, the assembly heard expert opinions and held extensive discussions regarding the legalization of abortion. Its recommendation, supported by a significant majority of members, was to allow abortions in all circumstances, subject to limits on the length of pregnancy. These conclusions set the stage for a 2018 referendum in which 66 percent of Ireland’s voters chose to repeal the Eighth Amendment, enabling abortion to be legalized. Such an outcome had been almost inconceivable a few years earlier.
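
The selection step described above is where the “fair algorithms” of the essay’s title come in: the panel must be random, yet still representative along the chosen dimensions. As a toy illustration only (not the more sophisticated methods the essay discusses, which also equalize each volunteer’s chance of being selected), here is a minimal quota-constrained sortition sketch in Python; the pool, features and quotas are hypothetical.

```python
import random
from collections import Counter

def select_panel(pool, quotas, size, max_tries=100_000):
    """Toy quota-constrained sortition by rejection sampling.

    pool:   list of dicts, e.g. {"age": "18-34", "gender": "F"}
    quotas: {feature: {value: exact seats required}} (each sums to size)
    """
    for _ in range(max_tries):
        panel = random.sample(pool, size)
        # Accept only if every feature's composition matches its quota.
        if all(Counter(p[feat] for p in panel) == Counter(quota)
               for feat, quota in quotas.items()):
            return panel
    raise RuntimeError("no quota-satisfying panel found")

# Hypothetical pool and quotas, for illustration only.
pool = [{"age": random.choice(["18-34", "35-59", "60+"]),
         "gender": random.choice(["F", "M"])} for _ in range(500)]
quotas = {"age": {"18-34": 3, "35-59": 4, "60+": 3},
          "gender": {"F": 5, "M": 5}}
panel = select_panel(pool, quotas, size=10)
```

Rejection sampling like this treats all quota-satisfying panels as interchangeable; the fairness problem the essay takes up arises because naive approaches can leave some volunteers with a far higher probability of selection than others.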

The Irish citizens’ assembly is just one example of a widespread phenomenon. In recent years hundreds of such groups have convened around the world, their members randomly selected from the concerned population and given time and information to aid their deliberations. Citizens’ assemblies in France, Germany, the U.K., Washington State and elsewhere have charted pathways for reducing carbon emissions. An assembly in Canada sought methods of mitigating hate speech and fake news; another in Australia recommended ethical approaches to human genome editing; and yet another in Oregon identified policies for COVID pandemic recovery. Taken together, these assemblies have demonstrated an impressive capacity to uncover the will of the people and build consensus.

The effectiveness of citizens’ assemblies isn’t surprising. Have you ever noticed how politicians grow a spine the moment they decide not to run for reelection? Well, a citizens’ assembly is a bit like a legislature whose members make a pact barring them from seeking another term in office. The randomly selected members are not beholden to party machinations or outside interests; they are free to speak their mind and vote their conscience…(More)”.

Addressing ethical gaps in ‘Technology for Good’: Foregrounding care and capabilities


Paper by Alison B. Powell et al: “This paper identifies and addresses persistent gaps in the consideration of ethical practice in ‘technology for good’ development contexts. Its main contribution is to model an integrative approach using multiple ethical frameworks to analyse and understand the everyday nature of ethical practice, including in professional practice among ‘technology for good’ start-ups. The paper identifies inherent paradoxes in the ‘technology for good’ sector as well as ethical gaps related to (1) the sometimes-misplaced assignment of virtuousness to an individual; (2) difficulties in understanding social constraints on ethical action; and (3) the often unaccounted-for mismatch between ethical intentions and outcomes in everyday practice, including in professional work associated with an ‘ethical turn’ in technology. These gaps persist even in contexts where ethics are foregrounded as matters of concern. To address the gaps, the paper suggests systemic, rather than individualized, considerations of care and capability applied to innovation settings, in combination with considerations of virtue and consequence. This paper advocates for addressing these challenges holistically in order to generate renewed capacity for change at a systemic level…(More)”.

Leveraging Data for the Public Good


Article by Christopher Pissarides, Fadi Farra and Amira Bensebaa: “…Yet data are simply too important to be entrusted to either governments or large corporations that treat them as their private property. Instead, governments should collaborate with companies on joint-governance frameworks that recognize both the opportunities and the risks of big data.

Businesses – which are best positioned to understand big data’s true value – must move beyond short-sighted efforts to prevent regulation. Instead, they need to initiate a dialogue with policymakers on how to design viable solutions that can leverage the currency of our era to benefit the public good. Doing so would help them regain public trust.

Governments, for their part, must avoid top-down regulatory strategies. To win the support they need from businesses, they should create incentives for data sharing and privacy protection and help develop new analytical tools through advanced modeling. Governments should also rethink and renew deeply rooted frameworks inherited from the industrial era, such as those for taxation and social welfare.

In the digital age, governments should recognize the centrality of data to policymaking and develop tools to reward businesses that contribute to the public good by sharing it. True, governments require taxes to raise revenues, but they must recognize that a better understanding of individuals enables more efficient policies. By recognizing companies’ ability to save public money and create social value, governments could encourage companies to share data as a matter of social responsibility…(More)”.

CNSTAT Report Emphasizes the Need for a National Data Infrastructure


Article by Molly Gahagen: “Having credible and accessible data is essential for various sectors of society to function. In the recent report, “Toward a 21st Century National Data Infrastructure: Mobilizing Information for the Common Good,” by the Committee on National Statistics (CNSTAT) of the National Academies of Sciences, Engineering, and Medicine, the importance of national data infrastructure is emphasized…

Emphasizing the need for reliable statistics for national, state and local government officials, as well as businesses and citizens, the report calls for a modern national data infrastructure incorporating data from multiple federal agencies. Initial recommendations and potential outcomes of such a system are contained in the report.

Recommendations include practices to incorporate data from many sources, safeguard privacy, freely share statistics with the public, ensure transparency and create a modern system that would allow for easy access and enhanced security.

Potential outcomes of this infrastructure highlighted by the report’s authors include more evidence-based policymaking at several levels of government, uniform regulations for data reporting and data access, and increased security. The report describes how this would tie into broader initiatives to promote research and evidence-based policymaking, including Congress’s passage of the Foundations for Evidence-Based Policymaking Act of 2018.

CNSTAT’s future reports seek to address the blending of multiple data sources, data equity, and technology and tools, among other topics…(More)”.

Eliminate data asymmetries to democratize data use


Article by Rahul Matthan: “Anyone who possesses a large enough store of data can reasonably expect to glean powerful insights from it. These insights are more often than not used to enhance advertising revenues or ensure greater customer stickiness. In other instances, they’ve been subverted to alter our political preferences and manipulate us into taking decisions we otherwise may not have.

The ability to generate insights places those who have access to these data sets at a distinct advantage over those whose data is contained within them. It allows the former to benefit from the data in ways that the latter may not even have thought possible when they consented to provide it. Given how easily these insights can be used to harm the very people the data describe, there is a need to mitigate the effects of this data asymmetry.

Privacy law attempts to do this by providing data principals with tools they can use to exert control over their personal data. It requires data collectors to obtain informed consent from data principals before collecting their data and forbids them from using it for any purpose other than that which has been previously notified. This is why, even if that consent has been obtained, data fiduciaries cannot collect more data than is absolutely necessary to achieve the stated purpose and are only allowed to retain that data for as long as is necessary to fulfil the stated purpose.

In India, we’ve gone one step further and built techno-legal solutions to help reduce this data asymmetry. The Data Empowerment and Protection Architecture (DEPA) framework makes it possible to extract data from the silos in which they reside and transfer it on the instructions of the data principal to other entities, which can then use it to provide other services to the data principal. This data micro-portability dilutes the historical advantage that incumbents enjoy on account of collecting data over the entire duration of their customer engagement. It eliminates data asymmetries by establishing the infrastructure that creates a competitive market for data-based services, allowing data principals to choose from a range of options as to how their data could be used for their benefit by service providers.
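
As a rough sketch of the kind of consent-mediated transfer DEPA enables, consider the following simplified Python example. The ConsentArtifact structure and its fields are illustrative assumptions for exposition, not the actual DEPA or account-aggregator specification.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical, simplified consent artifact for a DEPA-style flow.
# Field names are illustrative only, not the real specification.

@dataclass
class ConsentArtifact:
    principal_id: str     # the data principal granting consent
    data_provider: str    # entity currently holding the data
    data_consumer: str    # entity the data may be transferred to
    purpose: str          # notified purpose; any other use is barred
    expires_at: datetime  # data may not flow after this point

    def permits(self, consumer: str, purpose: str, now: datetime) -> bool:
        # Purpose limitation + expiry check before any transfer is honoured.
        return (consumer == self.data_consumer
                and purpose == self.purpose
                and now < self.expires_at)

consent = ConsentArtifact("user-42", "BankA", "LenderB",
                          "loan-eligibility", datetime(2030, 1, 1))
assert consent.permits("LenderB", "loan-eligibility", datetime(2025, 6, 1))
```

The point of the design is that a transfer is honoured only when the consumer, the purpose and the validity window all match what the data principal agreed to.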

This, however, is not the only type of asymmetry we have to deal with in this age of big data. In a recent article, Stefaan Verhulst of The GovLab at New York University pointed out that it is no longer enough to possess large stores of data—you need to know how to effectively extract value from it. Many businesses might have vast stores of data that they have accumulated over the years they have been in operation, but very few of them are able to effectively extract useful signals from that noisy data.

Without the know-how to translate data into actionable information, merely owning a large data set is of little value.

Unlike data asymmetries, which can be mitigated by making data more widely available, information asymmetries can only be addressed by radically democratizing the techniques and know-how that are necessary for extracting value from data. This know-how is largely proprietary and hard to access even in a fully competitive market. What’s more, in many instances, the computation power required far exceeds the capacity of entities for whom data analysis is not the main purpose of their business…(More)”.

Cutting through complexity using collective intelligence


Blog by the UK Policy Lab: “In November 2021 we established a Collective Intelligence Lab (CILab), with the aim of improving policy outcomes by tapping into collective intelligence (CI). We define CI as the diversity of thought and experience that is distributed across groups of people, from public servants and domain experts to members of the public. We have been experimenting with a digital tool, Pol.is, to capture diverse perspectives and new ideas on key government priority areas. To date we have run eight debates on issues as diverse as Civil Service modernisation, fisheries management and national security. Across these debates more than 2,400 civil servants, subject matter experts and members of the public have participated…

From our experience using CILab on live policy issues, we have identified a series of policy use cases that echo findings from the government of Taiwan and organisations such as Nesta. These use cases include: 1) stress-testing existing policies and current thinking, 2) drawing out consensus and divergence on complex, contentious issues, and 3) identifying novel policy ideas.

1) Stress-testing existing policy and current thinking

CI could be used to gauge expert and public sentiment towards existing policy ideas by asking participants to discuss existing policies and current thinking on Pol.is. This is well suited to testing public and expert opinions on current policy proposals, especially where their success depends on securing buy-in and action from stakeholders. It can also help collate views and identify barriers to effective implementation of existing policy.

From the initial set of eight CILab policy debates, we have learnt that it is sometimes useful to design a ‘crossover point’ into the process. This is where, part way through a debate, statements submitted by policymakers, subject matter experts and members of the public are shown across those groups, in a bid to break down groupthink. We used this approach in a Pol.is debate on a topic relating to UK foreign policy, and think it could help test how existing policies on complex areas such as climate change or social care are perceived within and outside government…(More)”
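
Tools like Pol.is surface consensus and divergence by clustering participants in an “opinion space” built from their agree/disagree votes. The sketch below is a minimal reconstruction of that style of analysis on synthetic data, with illustrative thresholds; it is an assumption-laden stand-in, not the production Pol.is pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic stand-in for real vote data: rows are participants, columns are
# statements; +1 = agree, -1 = disagree, 0 = pass/unseen.
rng = np.random.default_rng(0)
votes = rng.choice([-1, 0, 1], size=(200, 40))

# Project participants into a low-dimensional opinion space, then group them.
coords = PCA(n_components=2).fit_transform(votes)
labels = KMeans(n_clusters=3, n_init=10).fit_predict(coords)

# Mean agreement with each statement, per opinion group.
group_means = np.vstack([votes[labels == k].mean(axis=0) for k in range(3)])

# Illustrative thresholds: consensus statements lean positive in every group;
# divisive statements split the groups widely.
consensus = np.where(group_means.min(axis=0) > 0.3)[0]
divisive = np.where(np.ptp(group_means, axis=0) > 1.0)[0]
print("consensus statements:", consensus, "divisive statements:", divisive)
```

Real vote matrices are sparse and the number of opinion groups is chosen adaptively, but the principle is the same: consensus is agreement that holds across every opinion group, not just on average.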

Digital Government: Strategy, Government Models and Technology


Text book by Bernd W. Wirtz: “Digitization, the global networking of individuals and organizations, and the transition from an industrial to an information society are key reasons for the importance of digital government. In particular, the enormous influence of the Internet as a global networking and communication system affects the performance of public services.

This textbook introduces the concept of digital government as well as digital management and provides helpful insights and strategic advice for the successful implementation and maintenance of digital government systems…(More)”.

Can AI bring deliberative democracy to the masses?


Paper by Hélène Landemore: “A core problem in deliberative democracy is the tension between two seemingly equally important conditions of democratic legitimacy: deliberation on the one hand and mass participation on the other. Might artificial intelligence help bring quality deliberation to the masses? The paper first examines the conundrum in deliberative democracy around the tradeoff between deliberation and mass participation by returning to the seminal debate between Joshua Cohen and Jürgen Habermas about the proper model of deliberative democracy. It then turns to an analysis of the 2019 French Great National Debate, a low-tech attempt to involve millions of French citizens in a structured exercise of collective deliberation over a two-month period. Building on the shortcomings of this empirical attempt, the paper then considers two different visions for an algorithm-powered scaled-up form of mass deliberation—Mass Online Deliberation on the one hand and a multiplicity of rotating randomly selected mini-publics on the other—theorizing various ways Artificial Intelligence could play a role in either of them…(More)”.

The Participation Paradox


Book by Luke Sinwell: “The last two decades have ushered in what has become known as a participatory revolution, with consultants, advisors, and non-profits called into communities, classrooms, and corporations alike to listen to ordinary people. With exclusively bureaucratic approaches no longer in vogue, authorities now opt for “open” forums for engagement.

In The Participation Paradox Luke Sinwell argues that amplifying the voices of the poor and dispossessed is often a quick fix incapable of delivering concrete and lasting change. The ideology of public consultation and grassroots democracy can be a smokescreen for a cost-effective means by which to implement top-down decisions. As participation has become mainstreamed by governments around the world, so have its radical roots become tamed by neoliberal forces that reinforce existing relationships of power. Drawing from oral testimonies and ethnographic research, Sinwell presents a case study of one of the poorest and most defiant Black informal settlements in Johannesburg, South Africa – Thembelihle, which consists of more than twenty thousand residents – highlighting the promises and pitfalls of participatory approaches to development.

Providing a critical lens for understanding grassroots democracy, The Participation Paradox foregrounds alternatives capable of reclaiming participation’s emancipatory potential…(More)”.