Governing the Environment-Related Data Space


Stefaan G. Verhulst, Anthony Zacharzewski and Christian Hudson at Data & Policy: “Today, The GovLab and The Democratic Society published their report, “Governing the Environment-Related Data Space”, written by Jörn Fritzenkötter, Laura Hohoff, Paola Pierri, Stefaan G. Verhulst, Andrew Young, and Anthony Zacharzewski. The report captures the findings of their joint research centered on the responsible and effective reuse of environment-related data to achieve greater social and environmental impact.

Environment-related data (ERD) encompasses numerous kinds of data across a wide range of sectors. It can best be defined as data related to any element of the Driver-Pressure-State-Impact-Response (DPSIR) Framework. If leveraged effectively, this wealth of data could help society establish a sustainable economy, take action against climate change, and support environmental justice — as recognized recently by French President Emmanuel Macron and UN Secretary-General’s Special Envoy for Climate Ambition and Solutions Michael R. Bloomberg when establishing the Climate Data Steering Committee.
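
To make the DPSIR classification concrete, here is a minimal sketch that maps a few datasets onto the framework's five elements. The category names come from the framework itself; the example datasets are hypothetical.

```python
from enum import Enum


class DPSIR(Enum):
    """The five elements of the Driver-Pressure-State-Impact-Response framework."""
    DRIVER = "driver"        # underlying forces, e.g. population growth, energy demand
    PRESSURE = "pressure"    # stresses on the environment, e.g. emissions, land use
    STATE = "state"          # observed conditions, e.g. air quality, biodiversity
    IMPACT = "impact"        # consequences, e.g. health effects, crop losses
    RESPONSE = "response"    # societal actions, e.g. regulation, carbon pricing


# Hypothetical examples of how data from very different sectors all count as ERD:
example_datasets = {
    "national vehicle registrations": DPSIR.DRIVER,
    "industrial CO2 emissions inventory": DPSIR.PRESSURE,
    "river water-quality sensor readings": DPSIR.STATE,
    "heat-related hospital admissions": DPSIR.IMPACT,
    "municipal climate adaptation budgets": DPSIR.RESPONSE,
}

for name, element in example_datasets.items():
    print(f"{name} -> {element.value}")
```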

While several actors are working to improve access to, as well as promote the (re)use of, ERD, two key challenges hamper progress on this front: data asymmetries and data enclosures. Data asymmetries occur because ever-increasing amounts of ERD are scattered across diverse actors, with larger and more powerful stakeholders often enjoying disproportionate access. These asymmetries create problems with accessibility and findability (data enclosures), limiting sharing and collaboration and stunting the ability to use data and maximize its potential to address public ills.

The risks and costs of data enclosure and data asymmetries are high. Information bottlenecks cause resources to be misallocated, slow scientific progress, and limit our understanding of the environment.

A fit-for-purpose governance framework could offer a solution to these barriers by creating space for more systematic, sustainable, and responsible data sharing and collaboration. Better data sharing can in turn ease information flows, mitigate asymmetries, and minimize data enclosures.

And there are some clear criteria for an effective governance framework…(More)”

AI & Cities: Risks, Applications and Governance


Report by UN-Habitat: “Artificial intelligence is manifesting at an unprecedented rate in urban centers, often with significant risks and little oversight. Using AI technologies without the appropriate governance mechanisms and without adequate consideration of how they affect people’s human rights can have negative, even catastrophic, effects.

This report is part of UN-Habitat’s strategy for guiding local authorities in realizing a people-centered digital transformation process in their cities and settlements…(More)”.

Blueprint for an AI Bill of Rights


The White House: “…To advance President Biden’s vision, the White House Office of Science and Technology Policy has identified five principles that should guide the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence. The Blueprint for an AI Bill of Rights is a guide for a society that protects all people from these threats—and uses technologies in ways that reinforce our highest values. Responding to the experiences of the American public, and informed by insights from researchers, technologists, advocates, journalists, and policymakers, this framework is accompanied by From Principles to Practice—a handbook for anyone seeking to incorporate these protections into policy and practice, including detailed steps toward actualizing these principles in the technological design process. These principles help provide guidance whenever automated systems can meaningfully impact the public’s rights, opportunities, or access to critical needs.

  • Safe and Effective Systems
  • Algorithmic Discrimination Protections
  • Data Privacy
  • Notice and Explanation
  • Human Alternatives, Consideration, and Fallback…(More)”.

Policy evaluation in times of crisis: key issues and the way forward


OECD Paper: “This paper provides an overview of the challenges policy evaluators faced in the context of COVID-19, both due to pandemic-specific hurdles and resource constraints within governments. The paper then provides an overview of OECD governments’ evaluation practices during COVID-19, with a specific emphasis on the actors, the aims and the methods involved. The third and final section sets out lessons for future policy evaluations in light of advances made during the period, both for evaluating crisis responses and for the evaluation field in general…(More)”.

E-Government Survey 2022


UN Report: “The United Nations E-Government Survey 2022 is the 12th edition of the United Nations’ assessment of the digital government landscape across all 193 Member States. The E-Government Survey is informed by over two decades of longitudinal research, with a ranking of countries based on the United Nations E-Government Development Index (EGDI), a combination of primary data (collected and owned by the United Nations Department of Economic and Social Affairs) and secondary data from other UN agencies.
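
For context, the Survey's methodology computes each country's EGDI as the unweighted average of three normalized sub-indices: the Online Service Index (OSI), the Telecommunication Infrastructure Index (TII), and the Human Capital Index (HCI). The minimal sketch below illustrates that arithmetic; the component scores are invented for illustration.

```python
def normalize(value: float, min_value: float, max_value: float) -> float:
    """Min-max normalize a raw component score to the [0, 1] range,
    as the Survey does before combining sub-indices."""
    return (value - min_value) / (max_value - min_value)


def egdi(osi: float, tii: float, hci: float) -> float:
    """EGDI is the unweighted average of the three normalized sub-indices:
    Online Service Index, Telecommunication Infrastructure Index,
    and Human Capital Index."""
    return (osi + tii + hci) / 3


# Hypothetical, already-normalized component scores for one country:
print(round(egdi(osi=0.85, tii=0.72, hci=0.91), 4))  # -> 0.8267
```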

This edition of the Survey includes data analysis in global and regional contexts, a study of local e-government development based on the United Nations Local Online Service Index (LOSI), consideration of inclusion in the hybrid digital society, and a concluding chapter that outlines the trends and developments related to the future of digital government. As with all editions, it features extensive annexes on its data, methodology and related pilot study initiatives…(More)”.

The Data Liberation Project 


About: “The Data Liberation Project is a new initiative I’m launching today to identify, obtain, reformat, clean, document, publish, and disseminate government datasets of public interest. Vast troves of government data are inaccessible to the people and communities who need them most. The Process:

  • Identify: Through its own research, as well as through consultations with journalists, community groups, government-data experts, and others, the Data Liberation Project aims to identify a large number of datasets worth pursuing.
  • Obtain: The Data Liberation Project plans to use a wide range of methods to obtain the datasets, including Freedom of Information Act (FOIA) requests, interventions in lawsuits, web scraping, and advanced document parsing. To improve public knowledge about government data systems, the Data Liberation Project also files FOIA requests for essential metadata, such as database schemas, record layouts, data dictionaries, user guides, and glossaries.
  • Reformat: Many datasets are delivered to journalists and the public in difficult-to-use formats. Some may follow arcane conventions or require proprietary software to access, for instance. The Data Liberation Project will convert these datasets into open formats and restructure them so that they can be more easily examined (a minimal sketch of this step appears after this list).
  • Clean: The Data Liberation Project will not alter the raw records it receives. But when the messiness of datasets inhibits their usefulness, the project will create secondary, “clean” versions of datasets that fix these problems.
  • Document: Datasets are meaningless without context, and practically useless without documentation. The Data Liberation Project will gather official documentation for each dataset into a central location. It will also fill observed gaps in the documentation through its own research, interviews, and analysis.
  • Disseminate: The Data Liberation Project will not expect reporters and other members of the public simply to stumble upon these datasets. Instead, it will reach out to the newsrooms and communities that stand to benefit most from the data. The project will host hands-on workshops, webinars, and other events to help others to understand and use the data.”…(More)”
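
As an illustration of the “Reformat” step, here is a minimal sketch of converting a fixed-width government extract into an open CSV format with pandas. The file name, field widths, and field names are all hypothetical; in practice they would come from the record layouts and data dictionaries the project obtains alongside the data.

```python
import pandas as pd

# Hypothetical record layout, as an agency's data dictionary might describe it:
# each row is a sequence of fixed-width fields with no delimiters.
FIELD_WIDTHS = [9, 2, 8, 30]  # e.g. case ID, state code, date, facility name
FIELD_NAMES = ["case_id", "state", "report_date", "facility"]

# Read the fixed-width extract; keeping everything as text at first
# ensures no values are silently mangled.
records = pd.read_fwf(
    "agency_extract.txt",  # hypothetical input file
    widths=FIELD_WIDTHS,
    names=FIELD_NAMES,
    dtype=str,
)

# Restructure into friendlier types without altering the underlying values.
records["report_date"] = pd.to_datetime(records["report_date"], format="%Y%m%d")

# Publish in an open, widely readable format.
records.to_csv("agency_extract.csv", index=False)
```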

OECD Guidelines for Citizen Participation Processes


OECD: “The OECD Guidelines for Citizen Participation Processes are intended for any public official or public institution interested in carrying out a citizen participation process. The guidelines describe ten steps for designing, planning, implementing and evaluating a citizen participation process, and discuss eight different methods for involving citizens: information and data, open meetings, public consultations, open innovation, citizen science, civic monitoring, participatory budgeting and representative deliberative processes. The guidelines are illustrated with examples as well as practical guidance built on evidence gathered by the OECD. Finally, nine guiding principles are presented to help ensure the quality of these processes…(More)”.

Applications of an Analytic Framework on Using Public Opinion Data for Solving Intelligence Problems


Report by the National Academies of Sciences, Engineering, and Medicine: “Measuring and analyzing public opinion comes with tremendous challenges, as evidenced by recent struggles to predict election outcomes and to anticipate mass mobilizations. The National Academies of Sciences, Engineering, and Medicine publication Measurement and Analysis of Public Opinion: An Analytic Framework presents in-depth information from experts on how to collect and glean insights from public opinion data, particularly in conditions where contextual issues call for applying caveats to those data. The Analytic Framework is designed specifically to help intelligence community analysts apply insights from the social and behavioral sciences on state-of-the-art approaches to analyze public attitudes in non-Western populations. Sponsored by the intelligence community, the National Academies’ Board on Behavioral, Cognitive, and Sensory Sciences hosted a 2-day hybrid workshop on March 8–9, 2022, to present the Analytic Framework and to demonstrate its application across a series of hypothetical scenarios that might arise for an intelligence analyst tasked with summarizing public attitudes to inform a policy decision. Workshop participants explored cutting-edge methods for using large-scale data as well as cultural and ethical considerations for the collection and use of public opinion data. This publication summarizes the presentations and discussions of the workshop…(More)”.

Building the analytic capacity to support critical technology strategy


Paper by Erica R.H. Fuchs: “Existing federal agencies relevant to the science and technology enterprise are appropriately focused on their missions, but the U.S. lacks the intellectual foundations, data infrastructure, and analytics to identify opportunities where the value of investment across missions (e.g., national security, economic prosperity, social well-being) is greater than the sum of its parts.

The U.S. government lacks systematic mechanisms to assess the nation’s strengths, weaknesses, and opportunities in technology and to assess the long chain of suppliers involved in producing products critical to national missions.

Two examples where modern data and analytics—leveraging star interdisciplinary talent from across the nation—and a cross-mission approach could transform outcomes include 1) the difficulties the federal government had in facilitating the production and distribution of personal protective equipment in spring 2020, and 2) the lack of clarity about the causes of and solutions to the semiconductor shortage. Going forward, the scale-up of electric vehicles promises similar challenges…

The proposed critical technology analytics (CTA) program would identify 1) how emerging technologies and institutional innovations could transform timely situational awareness of U.S. and global technology capabilities, 2) opportunities for innovation to address U.S. domestic and international challenges, and 3) win-win opportunities across national missions. The program would be strategic and forward-looking, conducting work on a timeline of months and years rather than days and weeks, and would seek to generalize lessons from individual cases to inform the data and analytics capabilities the government needs to build to support cross-mission critical technology policy…(More)”.

Towards a permanent citizens’ participatory mechanism in the EU


Report by Alberto Alemanno: “This study, commissioned by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs at the request of the AFCO Committee, examines the EU participatory system and its existing participatory channels against mounting citizens’ expectations for greater participation in EU decision-making in the aftermath of the Conference on the Future of Europe. It proposes the creation of a permanent deliberative mechanism, entailing the participation of randomly selected citizens tasked with providing advice on some of the proposals originating from either existing participation channels or the EU institutions, in an attempt to make the EU more democratically responsive…(More)”