Innovative Citizen Participation and New Democratic Institutions


Report by the OECD: “Public authorities from all levels of government increasingly turn to Citizens’ Assemblies, Juries, Panels and other representative deliberative processes to tackle complex policy problems ranging from climate change to infrastructure investment decisions. They convene groups of people representing a wide cross-section of society for at least one full day – and often much longer – to learn, deliberate, and develop collective recommendations that consider the complexities and compromises required for solving multifaceted public issues.

This “deliberative wave” has been building since the 1980s, gaining momentum since around 2010. This report has gathered close to 300 representative deliberative practices to explore trends in such processes, identify different models, and analyse the trade-offs among different design choices as well as the benefits and limits of public deliberation.

It includes Good Practice Principles for Deliberative Processes for Public Decision Making, based on comparative empirical evidence gathered by the OECD in collaboration with leading practitioners from government, civil society, and academia. Finally, the report explores the reasons and routes for embedding deliberative activities into public institutions to give citizens a more permanent and meaningful role in shaping the policies affecting their lives….(More)”.

AI Procurement in a Box


Toolbox by the World Economic Forum: “AI Procurement in a Box is a practical guide that helps governments rethink the procurement of artificial intelligence (AI) with a focus on innovation, efficiency and ethics. Developing a new approach to the acquisition of emerging technologies such as AI will not only accelerate the adoption of AI in the administration, but also drive the development of ethical standards in AI development and deployment. Innovative procurement approaches have the potential to foster innovation, create competitive markets for AI systems and uphold public trust in the public-sector adoption of AI.

AI has the potential to vastly improve government operations and meet the needs of citizens in new ways, ranging from intelligently automating administrative processes to generating insights for public policy development and improving public service delivery, for example, through personalized healthcare. Many public institutions are lagging behind in harnessing this powerful technology because of challenges related to data, skills and ethical deployment.

Public procurement can be an important driver of government adoption of AI. This means not only ensuring that AI-driven technologies offering the best value for money are purchased, but also driving the ethical development and deployment of innovative AI systems….(More)”.

EU Company Data: State of the Union 2020


Report by OpenCorporates: “… on access to company data in the EU. It’s completely revised, with more detail on the impact that the lack of access to this critical dataset has – on business, on innovation, on democracy, and society.

However, the results are still not great:

  • Average score is low
    The average score across the EU in terms of access to company data is just 40 out of 100. This is better than the average score 8 years ago, which was just 23 out of 100, but still very low.
  • Some major economies score badly
    Some of the EU’s major economies continue to score very badly indeed, with Germany, for example, scoring just 15/100, Italy 10/100, and Spain 0/100.
  • EU policies undermined
    The report identifies 15 areas where the lack of open company data frustrates, impedes or otherwise has a negative impact on EU policy.
  • Inequalities widened
    The report also identifies how inequalities are further widened by poor access to this critical dataset, and how the recovery from COVID-19 will be hampered by it too.

On the plus side, the report also identifies the EU Open Data & PSI Directive passed last year as potentially game changing – but only if it is implemented fully, and there are significant doubts whether this will happen….(More)”

Libraries Supporting Open Government: Areas for Engagement and Lessons Learned


Report by IFLA: “This report explores the roles libraries play in different countries’ Open Government Partnership Action Plans. Within the OGP framework, states and civil society actors work together to set out commitments for reform, implement them, and review their impact in recurring two-year cycles.

In different countries’ OGP commitments over the years, libraries and library associations have assisted other agencies with the implementation of their commitments, or led their own initiatives. Offering venues for civic engagement, helping develop tools and platforms for easier access to government records, providing valuable cultural Open Data and more – libraries can play a versatile role in supporting and enabling Open Government.

The report outlines the Open Government policy areas that libraries have been engaged in, the roles they took up to help deliver on OGP commitments, and some of the key ways to maximise the impact of library interventions, drawing on the lessons from earlier OGP cycles….(More)”.

Global collaboration on human migration launches digital hub


Press Release: “The International Organization for Migration (IOM) and the Joint Research Centre (JRC) of the European Commission joined forces with The Governance Lab (The GovLab) at the NYU Tandon School of Engineering to launch an online home for the Big Data for Migration (BD4M) Alliance, the first-ever global network dedicated to facilitating responsible data innovation and collaboration for informed decision making on migration and human mobility.

We live in a fast-moving world where a huge amount of data is being generated by the private sector, but public-private data partnerships remain limited. The BD4M, convened in 2018 by the European Commission’s Knowledge Centre on Migration and Demography (KCMD) and the IOM’s Global Migration Data Analysis Centre (GMDAC), seeks to foster more cooperation in this area by connecting stakeholders and leveraging non-traditional data sources to improve understanding.

The new BD4M web page, www.data4migration.org, hosted by The GovLab, serves as a hub for the Alliance’s activities. It aims to inform stakeholders about the BD4M members, its objectives, ongoing projects, upcoming events and opportunities for collaboration.

To facilitate access to knowledge about how data innovation has contributed to informing migration policy and programs, the BD4M recently launched the Data Innovation Directory, which features examples of applications of new data sources and methodologies in the field of migration and human mobility.

The BD4M is open to members of international organizations, NGOs, the private sector, researchers and individual experts. In its partnership with The GovLab, the BD4M has helped identify a set of priority questions on migration that new data sources could contribute to answering. These questions were formulated by experts and validated through a public voting campaign as part of The 100 Questions Initiative….(More)”.

Centering Racial Equity Throughout Data Integration


Toolkit by AISP: “Societal “progress” is often marked by the construction of new infrastructure that fuels change and innovation. Just as railroads and interstate highways were the defining infrastructure projects of the 1800s and 1900s, the development of data infrastructure is a critical innovation of our century. Railroads and highways were drivers of development and prosperity for some investors and sites. Yet other individuals and communities were harmed, displaced, bypassed, ignored, and forgotten by those efforts.

At this moment in our history, we can co-create data infrastructure to promote racial equity and the public good, or we can invest in data infrastructure that disregards the historical, social, and political context—reinforcing racial inequity that continues to harm communities. Building data infrastructure without a racial equity lens and understanding of historical context will exacerbate existing inequalities along the lines of race, gender, class, and ability. Instead, we commit to contextualize our work in the historical and structural oppression that shapes it, and organize stakeholders across geography, sector, and experience to center racial equity throughout data integration….(More)”.

How Crowdsourcing Aided a Push to Preserve the Histories of Nazi Victims


Andrew Curry at the New York Times: “With people around the globe sheltering at home amid the pandemic, an archive of records documenting Nazi atrocities asked for help indexing them. Thousands joined the effort….

As the virus prompted lockdowns across Europe, the director of the Arolsen Archives — the world’s largest devoted to the victims of Nazi persecution — joined millions of others working remotely from home and spending lots more time in front of her computer.

“We thought, ‘Here’s an opportunity,’” said the director, Floriane Azoulay.

Two months later, the archive’s “Every Name Counts” project has attracted thousands of online volunteers to work as amateur archivists, indexing names from the archive’s enormous collection of papers. To date, they have added over 120,000 names, birth dates and prisoner numbers to the database.

“There’s been much more interest than we expected,” Ms. Azoulay said. “The fact that people were locked at home and so many cultural offerings have moved online has played a big role.”

It’s a big job: The Arolsen Archives are the largest collection of their kind in the world, with more than 30 million original documents. They contain information on the wartime experiences of as many as 40 million people, including Jews executed in extermination camps and forced laborers conscripted from across Nazi-occupied Europe.

The documents, which take up 16 miles of shelving, include things like train manifests, delousing records, work detail assignments and execution records…(More)”.

Digital contact tracing and surveillance during COVID-19


Report on General and Child-specific Ethical Issues by Gabrielle Berman, Karen Carter, Manuel García-Herranz and Vedran Sekara: “The last few years have seen a proliferation of means and approaches being used to collect sensitive or identifiable data on children. Technologies such as facial recognition and other biometrics, increased processing capacity for ‘big data’ analysis and data linkage, and the roll-out of mobile and internet services and access have substantially changed the nature of data collection, analysis, and use.

Real-time data are essential to support decision-makers in government, development and humanitarian agencies such as UNICEF to better understand the issues facing children, plan appropriate action, monitor progress and ensure that no one is left behind. But the collation and use of personally identifiable data may also pose significant risks to children’s rights.

UNICEF has undertaken substantial work to provide a foundation to understand and balance the potential benefits and risks to children of data collection. This work includes the Industry Toolkit on Children’s Online Privacy and Freedom of Expression and a partnership with GovLab on Responsible Data for Children (RD4C) – which promotes good practice principles and has developed practical tools to assist field offices, partners and governments to make responsible data management decisions.

Balancing the need to collect data to support good decision-making versus the need to protect children from harm created through the collection of the data has never been more challenging than in the context of the global COVID-19 pandemic. The response to the pandemic has seen an unprecedented rapid scaling up of technologies to support digital contact tracing and surveillance. The initial approach has included:

  • tracking using mobile phones and other digital devices (tablet computers, the Internet of Things, etc.)
  • surveillance to support movement restrictions, including through the use of location monitoring and facial recognition
  • a shift from in-person service provision and routine data collection to the use of remote or online platforms (including new processes for identity verification)
  • an increased focus on big data analysis and predictive modelling to fill data gaps…(More)”.

The Food Systems Dashboard


About: “The Food Systems Dashboard combines data from multiple sources to give users a complete view of food systems. Users can compare components of food systems across countries and regions. They can also identify and prioritize ways to sustainably improve diets and nutrition in their food systems.

Dashboards are useful tools that help users visualize and understand key information for complex systems. Users can track progress to see if policies or other interventions are working at a country or regional level.

In recent years, the public health and nutrition communities have used dashboards to track the progress of health goals and interventions, including the Sustainable Development Goals. To our knowledge, this is the first dashboard that collects country-level data across all components of the food system.

The Dashboard contains over 150 indicators that measure components, drivers, and outcomes of food systems at the country level. As new indicators and data become available, the Dashboard will be updated. Most data used for the Dashboard is open source and available to download directly from the website. Data is pooled from FAO, Euromonitor International, World Bank, and other global and regional data sources….(More)”.

Policy Priority Inference


Turing Institute: “…Policy Priority Inference builds on a behavioural computational model, taking into account the learning process of public officials, coordination problems, incomplete information, and imperfect governmental monitoring mechanisms. The approach is a unique mix of economic theory, behavioural economics, network science and agent-based modelling. The data that feeds the model for a specific country (or a sub-national unit, such as a state) includes measures of the country’s development indicators (DIs) and how they have moved over the years, specified government policy goals in relation to DIs, the quality of government monitoring of expenditure, and the quality of the country’s rule of law.

From these data alone – and, crucially, with no specific information on government expenditure, which is rarely made available – the model can infer the transformative resources a country has historically allocated towards its Sustainable Development Goals (SDGs), and assess the importance of the interlinkages between DIs. Importantly, it can also reveal where previously hidden inefficiencies lie.

How does it work? The researchers modelled the socioeconomic mechanisms of the policy-making process using agent-computing simulation. They created a simulator featuring an agent called “Government”, which makes decisions about how to allocate public expenditure, and agents called “Bureaucrats”, each of which is essentially a policy-maker linked to a single DI. If a Bureaucrat is allocated some resource, they will use a portion of it to improve their DI, with the rest lost to some degree of inefficiency (in reality, inefficiencies range from simple corruption to poor quality policies and inefficient government departments).

How much resource a Bureaucrat puts towards moving their DI depends on that agent’s experience: if becoming inefficient pays off, they’ll keep doing it. During the process, Government monitors the Bureaucrats, occasionally punishing inefficient ones, who may then improve their behaviour. In the model, a Bureaucrat’s chances of getting caught are linked to the quality of a government’s real-world monitoring of expenditure, and the extent to which they are punished is reflected in the strength of that country’s rule of law.
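The Government–Bureaucrat dynamic described above can be sketched in a few lines of agent-based simulation. This is an illustrative toy, not the researchers’ actual model: the agent names follow the text, but the learning rule, the even budget split, and all parameter values (`monitoring_quality`, `rule_of_law`, the 1.05 drift factor) are simplifying assumptions.

```python
import random

class Bureaucrat:
    """Toy policy-maker agent responsible for a single development indicator (DI)."""
    def __init__(self, monitoring_quality, rule_of_law):
        self.indicator = 0.0                   # current level of this agent's DI
        self.inefficiency = 0.5                # share of allocated resources diverted
        self.monitoring = monitoring_quality   # probability diversion is detected
        self.punishment = rule_of_law          # how strongly detection corrects behaviour

    def spend(self, resource):
        # Only the non-diverted share of the allocation improves the DI.
        self.indicator += resource * (1 - self.inefficiency)
        # Learning: if caught, cut diversion in proportion to the rule of law;
        # if diversion goes unpunished, drift towards diverting a little more.
        if random.random() < self.monitoring:
            self.inefficiency *= (1 - self.punishment)
        else:
            self.inefficiency = min(1.0, self.inefficiency * 1.05)

def simulate(n_bureaucrats=10, budget=1.0, steps=100,
             monitoring_quality=0.3, rule_of_law=0.5, seed=42):
    """Run the toy model and return the final level of each DI."""
    random.seed(seed)
    agents = [Bureaucrat(monitoring_quality, rule_of_law) for _ in range(n_bureaucrats)]
    for _ in range(steps):
        allocation = budget / n_bureaucrats   # Government splits the budget evenly here
        for agent in agents:
            agent.spend(allocation)
    return [agent.indicator for agent in agents]

indicators = simulate()
```

Even this crude version reproduces the qualitative claim in the text: rerunning `simulate` with high `monitoring_quality` and `rule_of_law` yields markedly higher indicator levels than a run with weak governance, because unpunished inefficiency compounds over time.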

Diagram of the Policy Priority Inference model
Using data on a country or state’s development indicators and its governance, Policy Priority Inference techniques can model how a government and its policy-makers allocate “transformational resources” to reach their sustainable development goals.

When the historical movements of a country’s DIs are reproduced through the internal workings of the model, the researchers have a powerful proxy for the real-world relationships between government activity, the movement of DIs, and the effects of the interlinkages between DIs, all of which are unique to that country. “Once we can match outcomes, we can discern something that’s going on in reality. But the fact that the method is matching the dynamics of real-world development indicators is just one of multiple ways that we validate our results,” Guerrero notes. This proxy can then be used to project which policy areas should be prioritised in future to best achieve the government’s specified development goals, including predictions of likely timescales.

What’s more, in combination with techniques from evolutionary computation, the model can identify DIs that are linked to large positive spillover effects. These DIs are dubbed “accelerators”. Targeting government resources at such development accelerators fosters not only more rapid results, but also more generalised development…(More)”.
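The idea of an “accelerator” can be made concrete with a small sketch: given a matrix of estimated spillovers between DIs, the accelerators are the indicators whose total outgoing positive spillover is largest. The matrix values below are invented for illustration (in the actual work they would be inferred by the model), and this simple ranking stands in for the evolutionary-computation search the text mentions.

```python
import numpy as np

# Hypothetical spillover matrix for four DIs: spill[i, j] is the estimated
# effect that improving indicator i has on indicator j (illustrative values).
spill = np.array([
    [0.0, 0.4, 0.1, 0.0],
    [0.1, 0.0, 0.0, 0.2],
    [0.5, 0.3, 0.0, 0.4],
    [0.0, 0.0, 0.1, 0.0],
])

# An "accelerator" is a DI with large total positive outgoing spillovers.
outgoing = spill.clip(min=0).sum(axis=1)
ranking = np.argsort(outgoing)[::-1]   # DIs ordered from most to least accelerating
print(ranking)  # here DI 2 tops the ranking (total outgoing spillover 1.2)
```

Targeting resources at the top-ranked DIs is what the text means by fostering “more generalised development”: improvements propagate through the interlinkage network rather than staying local.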