Good process is vital for good government


Andrea Siodmok and Matthew Taylor at the RSA: “…‘Bad’ process is time wasting and energy sapping. It can reinforce barriers to collaboration, solidify hierarchies and hamper adaptiveness.

‘Good process’ energises people, creates spaces for different ideas to emerge, builds trust and collective capacity.

The bad and good could be distinguished along several dimensions. Here are some:

Bad process:

  • Routine/happens because it happens            
  • Limited preparation and follow through         
  • Little or no facilitation            
  • Reinforces hierarchies, excludes key voices  
  • Rigid accountability focussed on blame           
  • Always formal and mandated           
  • Low trust/transactional       

Good process:

  • Mission/goal oriented – happens because it makes a difference
  • Sees process as part of a flow of change – clear accountability
  • Facilitated by people with necessary skills and techniques 
  • Inclusive, what matters is the quality of contributions not their source
  • Collective accountability focussed on learning 
  • Mixes formal and informal settings and methods, often voluntary
  • Trust enhancing/collaborative

Why is bad process so prevalent and good process so rare?

Because bad process is often the default. In the short term, bad process is easier, less resource-intensive, and less risky than good process.

Bringing people together in inclusive processes

Bringing key actors together in inclusive processes helps us both understand the system that is maintaining the status quo and build a joint sense of mission for a new status quo.

It also helps people start to identify and organise around key opportunities for change. 

One of the most positive developments to have occurred in and around Whitehall in recent years is the emergence of informal, system-spanning networks of public officials animated by shared values and goals, such as One Team Gov and a whole host of bottom-up networks on topics as diverse as wellbeing, inclusion, and climate change….(More)”.

Facebook Ads as a Demographic Tool to Measure the Urban-Rural Divide


Paper by Daniele Rama, Yelena Mejova, Michele Tizzoni, Kyriaki Kalimeri, and Ingmar Weber: “In the global move toward urbanization, making sure the people remaining in rural areas are not left behind in terms of development and policy considerations is a priority for governments worldwide. However, it is increasingly challenging to track important statistics concerning this sparse, geographically dispersed population, resulting in a lack of reliable, up-to-date data. In this study, we examine the usefulness of the Facebook Advertising platform, which offers a digital “census” of over two billion of its users, in measuring potential rural-urban inequalities.

We focus on Italy, a country where about 30% of the population lives in rural areas. First, we show that the population statistics that Facebook produces suffer from instability across time and incomplete coverage of sparsely populated municipalities. To overcome this limitation, we propose an alternative methodology for estimating Facebook Ads audiences that nearly triples the coverage of rural municipalities from 19% to 55% and makes fine-grained sub-population analysis feasible. Using official national census data, we evaluate our approach and confirm known significant urban-rural divides in terms of educational attainment and income. Extending the analysis to Facebook-specific user “interests” and behaviors, we provide further insights on the divide, for instance, finding that rural areas show a higher interest in gambling. Notably, we find that the most predictive features of income in rural areas differ from those for urban centres, suggesting researchers need to consider a broader range of attributes when examining rural wellbeing. The findings of this study illustrate the necessity of improving existing tools and methodologies to include under-represented populations in digital demographic studies — the failure to do so could result in misleading observations, conclusions, and most importantly, policies….(More)”.
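
The core of the audience-estimation step is easy to illustrate: obtain an audience estimate for each municipality from the advertising platform, treat municipalities that fall below the platform’s reporting floor as uncovered, and compare the rest against official census counts. The sketch below is a toy illustration of that comparison under assumed numbers; it does not reproduce the paper’s methodology, and a real analysis would query the Facebook Marketing API for the audience estimates rather than hard-coding them.

```python
# Toy sketch of the coverage/penetration comparison described above.
# The numbers and the data structure are hypothetical; a real analysis would
# pull per-municipality audience estimates from the advertising platform and
# join them to official census records.

from collections import defaultdict

# (census population, estimated platform audience, urban/rural label).
# An audience of 0 stands in for municipalities the platform reports only
# as "below minimum audience size", i.e. not covered.
municipalities = {
    "A001": (85_000, 52_000, "urban"),
    "A002": (2_300, 0, "rural"),
    "A003": (4_800, 1_900, "rural"),
    "A004": (120_000, 70_500, "urban"),
    "A005": (1_100, 0, "rural"),
}

covered = defaultdict(int)
total = defaultdict(int)
ratios = defaultdict(list)

for census_pop, audience, area_type in municipalities.values():
    total[area_type] += 1
    if audience > 0:  # usable estimate: municipality counts as covered
        covered[area_type] += 1
        ratios[area_type].append(audience / census_pop)

for area_type in sorted(total):
    coverage = covered[area_type] / total[area_type]
    mean_ratio = (sum(ratios[area_type]) / len(ratios[area_type])
                  if ratios[area_type] else float("nan"))
    print(f"{area_type}: {coverage:.0%} of municipalities covered, "
          f"mean audience/census ratio {mean_ratio:.2f}")
```

The coverage figures cited in the abstract (19% versus 55% of rural municipalities) correspond to this kind of measure: the share of rural municipalities for which a usable audience estimate can be obtained at all under the original and the proposed estimation approaches.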

Decide Madrid: A Critical Analysis of an Award-Winning e-Participation Initiative


Paper by Sonia Royo, Vicente Pina and Jaime Garcia-Rayado: “This paper analyzes the award-winning e-participation initiative of the city council of Madrid, Decide Madrid, to identify the critical success factors and the main barriers that are conditioning its performance. An exploratory case study is used as a research technique, including desk research and semi-structured interviews. The analysis distinguishes contextual, organizational and individual level factors; it considers whether the factors or barriers are more related to the information and communication technology (ICT) component, public sector context or democratic participation; it also differentiates among the different stages of the development of the initiative. Results show that individual and organizational factors related to the public sector context and democratic participation are the most relevant success factors.

The high expectations of citizens explain the high levels of participation in the initial stages of Decide Madrid. However, the lack of transparency and the poor functioning of some of its participatory activities (organizational factors related to the ICT and democratic dimensions) are negatively affecting its performance. The software created for this platform, Consul, has been adopted, or is in the process of being implemented, in more than 100 institutions in 33 countries. Therefore, the findings of this research can potentially be useful for improving the performance and sustainability of e-participation platforms worldwide…(More)”.

The Future of Democracy in Europe: Technology and the Evolution of Representation


Report by Chatham House: “There is a widespread sense that liberal democracy is in crisis, but little consensus exists on the specific nature and causes of the crisis. In particular, there are three prisms through which the crisis is usually seen: the rise of ‘populism’, ‘democratic deconsolidation’, and a ‘hollowing out’ of democracy. Each reflects normative assumptions about democracy.

The exact role of digital technology in the crisis is disputed. Despite the widely held perception that social media is undermining democracy, the evidence for this is limited. Over the longer term, the further development of digital technology could undermine the fundamental preconditions for democracy – though the pace and breadth of technological change make predictions about its future impact difficult.

Democracy functions in different ways in different European countries, with political systems on the continent ranging from ‘majoritarian democracies’ such as the UK to ‘consensual democracies’ such as Belgium and Switzerland. However, no type seems to be immune from the crisis. The political systems of EU member states also interact in diverse ways with the EU’s own structure, which is problematic for representative democracy as conventionally understood, but difficult to reform.

Political parties, central to the model of representative democracy that emerged in the late 18th century, have long seemed to be in decline. Recently there have been some signs of a reversal of this trend, with the emergence of parties that have used digital technology in innovative ways to reconnect with citizens. Traditional parties can learn from these new ‘digital parties’.

Recent years have also seen a proliferation of experiments in direct and deliberative democracy. There is a need for more experimentation in these alternative forms of democracy, and for further evaluation of how they can be integrated into the existing institutions and processes of representative democracy at the local, regional, national and EU levels.

We should not think of democracy in a static way – that is, as a system that can be perfected once and for all and then simply maintained and defended against threats. Democracy has continually evolved and now needs to evolve further. The solution to the crisis will not be to attempt to limit democracy in response to pressure from ‘populism’ but to deepen it further as part of a ‘democratization of democracy’….(More)”.

Algorithms and Contract Law


Paper by Lauren Henry Scholz: “Generalist confusion about the technology behind complex algorithms has led to inconsistent case law for algorithmic contracts. Case law explicitly grounded in the principle that algorithms are constructive agents for the companies they serve would provide a clear basis for the enforceability of algorithmic contracts that is both principled from a technological perspective and readily intelligible and applicable by generalists….(More)”.

The Economics of Maps


Abhishek Nagaraj and Scott Stern in the Journal of Economic Perspectives: “For centuries, maps have codified the extent of human geographic knowledge and shaped discovery and economic decision-making. Economists across many fields, including urban economics, public finance, political economy, and economic geography, have long employed maps, yet have largely abstracted away from exploring the economic determinants and consequences of maps as a subject of independent study. In this essay, we first review and unify recent literature in a variety of different fields that highlights the economic and social consequences of maps, along with an overview of the modern geospatial industry. We then outline our economic framework in which a given map is the result of economic choices around map data and designs, resulting in variations in private and social returns to mapmaking. We highlight five important economic and institutional factors shaping mapmakers’ data and design choices. Our essay ends by proposing that economists pay more attention to the endogeneity of mapmaking and the resulting consequences for economic and social welfare…(More)”.

Corporate Capitalism's Use of Openness: Profit for Free?


Book by Arwid Lund and Mariano Zukerfeld: “This book tackles the concept of openness (as in open source software, open access and free culture) from a critical political economy perspective, considering both its encroachment by capitalist corporations and how it advances radical alternatives to cognitive capitalism.

Drawing on four case studies, Corporate Capitalism’s Use of Openness will add to the discussion on open source software, open access content platforms, open access publishing, and open university courses. These otherwise disparate cases share two fundamental features: informational capitalist corporations base their successful business models on unpaid productive activities (play, attention, knowledge and labour), and they do so crucially by resorting to ideological uses of concepts such as “openness”, “communities” and “sharing”.

The authors present potential solutions and alternative regulations to counter these exploitative and alienating business models, and to foster digital knowledge commons, ranging from co-ops and commons-based peer production to state agencies’ platforms. Their research and findings will appeal to students, academics and activists around the world in fields such as sociology, economy, media and communication, library and information science, political sciences and technology studies….(More)”.

Imagining Regulation Differently: Co-creating for Engagement


Book edited by Morag McDermont, Tim Cole, Janet Newman and Angela Piccini: “There is an urgent need to rethink relationships between systems of government and those who are ‘governed’. This book explores ways of rethinking those relationships by bringing communities normally excluded from decision-making to centre stage to experiment with new methods of regulating for engagement.

Using original, co-produced research, it innovatively shows how we can better use a ‘bottom-up’ approach to design regulatory regimes that recognise the capabilities of communities at the margins and powerfully support the knowledge, passions and creativity of citizens. The authors provide essential guidance for all those working on co-produced research to make impactful change…(More)”.

Automation in Moderation


Article by Hannah Bloch-Wehba: “This Article assesses recent efforts to compel or encourage online platforms to use automated means to prevent the dissemination of unlawful online content before it is ever seen or distributed. As lawmakers in Europe and around the world closely scrutinize platforms’ “content moderation” practices, automation and artificial intelligence appear increasingly attractive options for ridding the Internet of many kinds of harmful online content, including defamation, copyright infringement, and terrorist speech. Proponents of these initiatives suggest that requiring platforms to screen user content using automation will promote healthier online discourse and will aid efforts to limit Big Tech’s power.

In fact, however, the regulations that incentivize platforms to use automation in content moderation come with unappreciated costs for civil liberties and unexpected benefits for platforms. The new automation techniques exacerbate existing risks to free speech and user privacy and create ripe new sources of information for surveillance, aggravating threats to free expression, associational rights, religious freedoms, and equality. Automation also worsens transparency and accountability deficits. Far from curtailing private power, the new regulations endorse and expand platform authority to police online speech, with little in the way of oversight and few countervailing checks. New regulations of online intermediaries should therefore incorporate checks on the use of automation to avoid exacerbating these dynamics. Carefully drawn transparency obligations, algorithmic accountability mechanisms, and procedural safeguards can help to ameliorate the effects of these regulations on users and competition…(More)”.
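
To make the abstract idea of automated pre-screening concrete, the minimal sketch below checks each upload against a blocklist of cryptographic hashes of files already identified as prohibited. This is only an illustrative stand-in, not anything described in the article: deployed systems generally rely on perceptual hashing (so that re-encoded copies still match) or machine-learning classifiers rather than exact matching.

```python
# Simplified illustration of hash-based upload screening. Production filters
# typically use perceptual hashes or trained classifiers rather than exact
# SHA-256 matching, which a one-byte change to the file would defeat.

import hashlib

# Hypothetical blocklist: digests of files already identified as prohibited.
BLOCKED_HASHES = {
    hashlib.sha256(b"example of known prohibited content").hexdigest(),
}


def is_blocked(upload: bytes) -> bool:
    """Return True if the upload's SHA-256 digest is on the blocklist."""
    return hashlib.sha256(upload).hexdigest() in BLOCKED_HASHES


if __name__ == "__main__":
    print(is_blocked(b"example of known prohibited content"))  # True
    print(is_blocked(b"an ordinary upload"))                   # False
```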

Many Tech Experts Say Digital Disruption Will Hurt Democracy


Lee Rainie and Janna Anderson at Pew Research Center: “The years of almost unfettered enthusiasm about the benefits of the internet have been followed by a period of techlash as users worry about the actors who exploit the speed, reach and complexity of the internet for harmful purposes. Over the past four years – a time of the Brexit decision in the United Kingdom, the American presidential election and a variety of other elections – the digital disruption of democracy has been a leading concern.

The hunt for remedies is at an early stage. Resistance to American-based big tech firms is increasingly evident, and some tech pioneers have joined the chorus. Governments are actively investigating technology firms, and some tech firms themselves are requesting government regulation. Additionally, nonprofit organizations and foundations are directing resources toward finding the best strategies for coping with the harmful effects of disruption. For example, the Knight Foundation announced in 2019 that it is awarding $50 million in grants to encourage the development of a new field of research centered on technology’s impact on democracy.

In light of this furor, Pew Research Center and Elon University’s Imagining the Internet Center canvassed technology experts in the summer of 2019 to gain their insights about the potential future effects of people’s use of technology on democracy….

The main themes found in an analysis of the experts’ comments are outlined in the next two tables….(More)”.