Good process is vital for good government


Andrea Siodmok and Matthew Taylor at the RSA: “…‘Bad’ process is time-wasting and energy-sapping. It can reinforce barriers to collaboration, solidify hierarchies and hamper adaptiveness.

‘Good process’ energises people, creates spaces for different ideas to emerge, builds trust and collective capacity.

The bad and good could be distinguished along several dimensions. Here are some:

Bad process:

  • Routine/happens because it happens            
  • Limited preparation and follow through         
  • Little or no facilitation            
  • Reinforces hierarchies, excludes key voices  
  • Rigid accountability focussed on blame           
  • Always formal and mandated           
  • Low trust/transactional       

Good process:

  • Mission/goal oriented – happens because it makes a difference
  • Sees process as part of a flow of change – clear accountability
  • Facilitated by people with necessary skills and techniques 
  • Inclusive, what matters is the quality of contributions not their source
  • Collective accountability focussed on learning 
  • Mixes formal and informal settings and methods, often voluntary
  • Trust enhancing/collaborative

Why is bad process so prevalent and good process so rare?

Because bad process is often the default. In the short term, bad process is easier, less resource-intensive, and less risky than good process.

Bringing people together in inclusive processes

Bringing key actors together in inclusive processes helps us both understand the system that is maintaining the status quo and build a joint sense of mission for a new status quo.

It also helps people start to identify and organise around key opportunities for change. 

One of the most positive developments to have occurred in and around Whitehall in recent years is the emergence of informal, system spanning networks of public officials animated by shared values and goals such as One Team Gov and a whole host of bottom up networks on topics as diverse as wellbeing, inclusion, and climate change….(More)”.

Eurobarometer survey shows support for sustainability and data sharing


Press Release: “Europeans want their digital devices to be easier to repair or recycle and are willing to share their personal information to improve public services, as a special Eurobarometer survey shows. The survey, released today, measured attitudes towards the impact of digitalisation on the daily lives of Europeans in the 27 EU Member States and the United Kingdom. It covers several different areas including digitalisation and the environment, sharing personal information, disinformation, digital skills and the use of digital ID….

Overall, 59% of respondents would be willing to share some of their personal information securely to improve public services. In particular, most respondents are willing to share their data to improve medical research and care (42%), to improve the response to crises (31%) or to improve public transport and reduce air pollution (26%).

An overwhelming majority of respondents who use their social media accounts to log in to other online services (74%) want to know how their data is used. A large majority would consider it useful to have a secure single digital ID that could serve for all online services and give them control over the use of their data….

In addition to the Special Eurobarometer report, the last iteration of the Standard Eurobarometer, conducted in November 2019, also tested public perceptions related to Artificial Intelligence. The findings were also published in a separate report today.

Around half of the respondents (51%) said that public policy intervention is needed to ensure ethical applications. Half of the respondents (50%) mention the healthcare sector as the area where AI could be most beneficial. A strong majority (80%) of the respondents think that they should be informed when a digital service or mobile application uses AI in various situations….(More)”.

Facebook Ads as a Demographic Tool to Measure the Urban-Rural Divide


Paper by Daniele Rama, Yelena Mejova, Michele Tizzoni, Kyriaki Kalimeri, and Ingmar Weber: “In the global move toward urbanization, making sure the people remaining in rural areas are not left behind in terms of development and policy considerations is a priority for governments worldwide. However, it is increasingly challenging to track important statistics concerning this sparse, geographically dispersed population, resulting in a lack of reliable, up-to-date data. In this study, we examine the usefulness of the Facebook Advertising platform, which offers a digital “census” of over two billion of its users, in measuring potential rural-urban inequalities.

We focus on Italy, a country where about 30% of the population lives in rural areas. First, we show that the population statistics that Facebook produces suffer from instability across time and incomplete coverage of sparsely populated municipalities. To overcome these limitations, we propose an alternative methodology for estimating Facebook Ads audiences that nearly triples the coverage of rural municipalities, from 19% to 55%, and makes fine-grained sub-population analysis feasible. Using official national census data, we evaluate our approach and confirm known significant urban-rural divides in terms of educational attainment and income. Extending the analysis to Facebook-specific user “interests” and behaviors, we provide further insights on the divide, for instance, finding that rural areas show a higher interest in gambling. Notably, we find that the most predictive features of income in rural areas differ from those for urban centres, suggesting researchers need to consider a broader range of attributes when examining rural wellbeing. The findings of this study illustrate the necessity of improving existing tools and methodologies to include under-represented populations in digital demographic studies — the failure to do so could result in misleading observations, conclusions, and most importantly, policies….(More)”.
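
For readers curious what the coverage check described in the abstract looks like in practice, here is a minimal, hypothetical sketch of matching Facebook Ads audience estimates against census counts per municipality. File and column names are assumptions for illustration, not the authors' actual pipeline:

```python
import pandas as pd

# Hypothetical inputs: per-municipality Facebook audience estimates and official census counts.
fb = pd.read_csv("facebook_audience_estimates.csv")    # assumed columns: municipality_id, fb_audience
census = pd.read_csv("census_population.csv")          # assumed columns: municipality_id, population, is_rural (0/1)

df = census.merge(fb, on="municipality_id", how="left").fillna({"fb_audience": 0})

# Coverage: share of the census population that the advertising platform "sees" in each municipality.
df["coverage"] = df["fb_audience"] / df["population"]

# Share of rural municipalities with any usable audience estimate
# (the paper reports this rising from 19% to 55% with the improved methodology).
rural = df[df["is_rural"] == 1]
print("rural municipalities with an estimate:", (rural["fb_audience"] > 0).mean())

# Compare average coverage across the urban-rural divide.
print(df.groupby("is_rural")["coverage"].mean())
```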

Decide Madrid: A Critical Analysis of an Award-Winning e-Participation Initiative


Paper by Sonia Royo, Vicente Pina and Jaime Garcia-Rayado: “This paper analyzes the award-winning e-participation initiative of the city council of Madrid, Decide Madrid, to identify the critical success factors and the main barriers that are conditioning its performance. An exploratory case study is used as a research technique, including desk research and semi-structured interviews. The analysis distinguishes contextual, organizational and individual level factors; it considers whether the factors or barriers are more related to the information and communication technology (ICT) component, public sector context or democratic participation; it also differentiates among the different stages of the development of the initiative. Results show that individual and organizational factors related to the public sector context and democratic participation are the most relevant success factors.

The high expectations of citizens explain the high levels of participation in the initial stages of Decide Madrid. However, the lack of transparency and poor functioning of some of its participatory activities (organizational factors related to the ICT and democratic dimensions) are negatively affecting its performance. The software created for this platform, Consul, has been adopted or is in the process of being implemented in more than 100 institutions in 33 countries. Therefore, the findings of this research can potentially be useful to improve the performance and sustainability of e-participation platforms worldwide…(More)”.

The Future of Democracy in Europe: Technology and the Evolution of Representation


Report by Chatham House: “There is a widespread sense that liberal democracy is in crisis, but little consensus exists on the specific nature and causes of the crisis. In particular, there are three prisms through which the crisis is usually seen: the rise of ‘populism’, ‘democratic deconsolidation’, and a ‘hollowing out’ of democracy. Each reflects normative assumptions about democracy.

The exact role of digital technology in the crisis is disputed. Despite the widely held perception that social media is undermining democracy, the evidence for this is limited. Over the longer term, the further development of digital technology could undermine the fundamental preconditions for democracy – though the pace and breadth of technological change make predictions about its future impact difficult.

Democracy functions in different ways in different European countries, with political systems on the continent ranging from ‘majoritarian democracies’ such as the UK to ‘consensual democracies’ such as Belgium and Switzerland. However, no type seems to be immune from the crisis. The political systems of EU member states also interact in diverse ways with the EU’s own structure, which is problematic for representative democracy as conventionally understood, but difficult to reform.

Political parties, central to the model of representative democracy that emerged in the late 18th century, have long seemed to be in decline. Recently there have been some signs of a reversal of this trend, with the emergence of parties that have used digital technology in innovative ways to reconnect with citizens. Traditional parties can learn from these new ‘digital parties’.

Recent years have also seen a proliferation of experiments in direct and deliberative democracy. There is a need for more experimentation in these alternative forms of democracy, and for further evaluation of how they can be integrated into the existing institutions and processes of representative democracy at the local, regional, national and EU levels.

We should not think of democracy in a static way – that is, as a system that can be perfected once and for all and then simply maintained and defended against threats. Democracy has continually evolved and now needs to evolve further. The solution to the crisis will not be to attempt to limit democracy in response to pressure from ‘populism’ but to deepen it further as part of a ‘democratization of democracy’….(More)”.

Car Data Facts


About: “Welcome to CarDataFacts.eu! This website provides a fact-based overview on everything related to the sharing of vehicle-generated data with third parties. Through a series of educational infographics, this website answers the most common questions about access to car data in a clear and simple way.

CarDataFacts.eu also addresses consumer concerns about sharing data in a safe and a secure way, as well as explaining some of the complex and technical terminology surrounding the debate.

CarDataFacts.eu is brought to you by ACEA, the European Automobile Manufacturers’ Association, which represents the 15 Europe-based car, van, truck and bus makers….(More)”.

Imagining Regulation Differently: Co-creating for Engagement


Book edited by Morag McDermont, Tim Cole, Janet Newman and Angela Piccini: “There is an urgent need to rethink relationships between systems of government and those who are ‘governed’. This book explores ways of rethinking those relationships by bringing communities normally excluded from decision-making to centre stage to experiment with new methods of regulating for engagement.

Using original, co-produced research, it innovatively shows how we can better use a ‘bottom-up’ approach to design regulatory regimes that recognise the capabilities of communities at the margins and powerfully support the knowledge, passions and creativity of citizens. The authors provide essential guidance for all those working on co-produced research to make impactful change…(More)”.

Facial Recognition Software Requires Checks and Balances


David Eaves and Naeha Rashid in Policy Options: “A few weeks ago, members of the Nexus traveller identification program were notified that Canadian Border Services is upgrading its automated system, from iris scanners to facial recognition technology. This is meant to simplify identification and increase efficiency without compromising security. But it also raises profound questions concerning how we discuss and develop public policies around such technology – questions that may not be receiving sufficiently open debate in the rush toward promised greater security.

Analogous to the U.S. Customs and Border Protection (CBP) program Global Entry, Nexus is a joint Canada-US border control system designed for low-risk, pre-approved travellers. Nexus does provide a public good, and there are valid reasons to improve surveillance at airports. Even before 9/11, border surveillance was an accepted annoyance and since then, checkpoint operations have become more vigilant and complex in response to the public demand for safety.

Nexus is one of the first North American government-sponsored services to adopt facial recognition, and as such it could be a pilot program that other services will follow. Left unchecked, the technology will likely become ubiquitous at North American border crossings within the next decade, and it will probably be adopted by governments to solve domestic policy challenges.

Facial recognition software is imperfect and has documented bias, but it will continue to improve and become superior to humans in identifying individuals. Given this, questions arise such as: what policies guide the use of this technology? What policies should inform future government use? In our headlong rush toward enhanced security, we risk replicating the justification used by the private sector in its attempt to balance effectiveness, efficiency and privacy.

One key question involves citizens’ capacity to consent. Previously, Nexus members submitted to fingerprint and retinal scans – biometric markers that are relatively unique and enable government to verify identity at the border. Facial recognition technology uses visual data and seeks, analyzes, and stores identifying facial information in a database, which is then used to compare with new images and video….(More)”.
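
To make the enrol-then-compare pattern described above concrete, here is a minimal sketch using the open-source face_recognition library (this is not the system used by Nexus; the file names and the 0.6 distance threshold are illustrative assumptions):

```python
import face_recognition  # open-source wrapper around dlib's face-embedding model

# Enrolment: compute and store an embedding ("faceprint") for a known traveller.
enrolled_image = face_recognition.load_image_file("enrolled_photo.jpg")
enrolled_encoding = face_recognition.face_encodings(enrolled_image)[0]

# Verification: encode a new image captured, for example, at a border kiosk.
probe_image = face_recognition.load_image_file("kiosk_capture.jpg")
probe_encodings = face_recognition.face_encodings(probe_image)

if probe_encodings:
    # Distance between embeddings; lower means more similar. The threshold is a policy choice:
    # it trades false accepts against false rejects, and error rates can differ across demographic groups.
    distance = face_recognition.face_distance([enrolled_encoding], probe_encodings[0])[0]
    print(f"distance={distance:.3f}, match={distance < 0.6}")  # 0.6 is the library's conventional default
else:
    print("No face detected in the probe image.")
```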

Tesco Grocery 1.0, a large-scale dataset of grocery purchases in London


Paper by Luca Maria Aiello, Daniele Quercia, Rossano Schifanella & Lucia Del Prete: “We present the Tesco Grocery 1.0 dataset: a record of 420 M food items purchased by 1.6 M fidelity card owners who shopped at the 411 Tesco stores in Greater London over the course of the entire year of 2015, aggregated at the level of census areas to preserve anonymity. For each area, we report the number of transactions and nutritional properties of the typical food item bought including the average caloric intake and the composition of nutrients.

The set of global trade item numbers (barcodes) for each food type is also included. To establish data validity we: i) compare food purchase volumes to population from census to assess representativeness, and ii) match nutrient and energy intake to official statistics of food-related illnesses to appraise the extent to which the dataset is ecologically valid. Given its unprecedented scale and geographic granularity, the data can be used to link food purchases to a number of geographically-salient indicators, which enables studies on health outcomes, cultural aspects, and economic factors….(More)”.
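
As an illustration, validation step i) amounts to checking whether purchase volumes track resident population across census areas. A minimal sketch follows; file and column names are assumptions rather than the dataset's actual schema:

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical file and column names; the released data are aggregated per census area.
tesco = pd.read_csv("tesco_grocery_2015_areas.csv")    # assumed: area_id, num_transactions, avg_energy_kcal
census = pd.read_csv("london_census_population.csv")   # assumed: area_id, population

df = tesco.merge(census, on="area_id")

# Representativeness check (step i): do purchase volumes track resident population?
r, p = pearsonr(df["num_transactions"], df["population"])
print(f"Pearson r = {r:.2f} (p = {p:.1e})")

# A downstream use in the spirit of step ii: per-area caloric content of the typical item,
# ready to be matched against official statistics on food-related illness.
print(df["avg_energy_kcal"].describe())
```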

The Economic Impact of Open Data: Opportunities for value creation in Europe


Press Release: “The European Data Portal publishes its study “The Economic Impact of Open Data: Opportunities for value creation in Europe”, which assesses the value created by open data in Europe. It is the second study by the European Data Portal, following the 2015 report. The open data market size is estimated at €184 billion and forecast to reach between €199.51 and €334.21 billion in 2025. The report additionally considers how this market size is distributed across different sectors and how many people are employed due to open data. The efficiency gains from open data, such as potential lives saved, time saved, environmental benefits, and improvement of language services, as well as associated potential cost savings, are explored and quantified where possible. Finally, the report also considers examples and insights from open data re-use in organisations. The key findings of the report are summarised below:

  1. The specification and implementation of high-value datasets as part of the new Open Data Directive is a promising opportunity to address quality & quantity demands of open data.
  2. Addressing quality & quantity demands is important, yet not enough to reach the full potential of open data.
  3. Open data re-users have to be aware and capable of understanding and leveraging the potential.
  4. Open data value creation is part of the wider challenge of skill and process transformation: a lengthy process whose change and impact are not always easy to observe and measure.
  5. Sector-specific initiatives and collaboration in and across the private and public sectors foster value creation.
  6. Combining open data with personal, shared, or crowdsourced data is vital for the realisation of further growth of the open data market.
  7. For different challenges, we must explore and improve multiple approaches of data re-use that are ethical, sustainable, and fit-for-purpose….(More)”.
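
As a rough illustration of what the forecast quoted above implies, the following back-of-the-envelope calculation converts the market-size figures into annual growth rates, assuming the €184 billion estimate refers to 2019 (the base year is an assumption, not stated in the excerpt):

```python
# Implied compound annual growth rate (CAGR) for the open data market-size forecast above.
base, base_year, target_year = 184.0, 2019, 2025   # figures in billions of euros; base year assumed

for label, target in [("low scenario", 199.51), ("high scenario", 334.21)]:
    cagr = (target / base) ** (1 / (target_year - base_year)) - 1
    print(f"{label}: {cagr:.1%} per year")   # roughly 1.4% and 10.5% respectively
```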