Third Wave of Open Data


Paper (and site) by Stefaan G. Verhulst, Andrew Young, Andrew J. Zahuranec, Susan Ariel Aaronson, Ania Calderon, and Matt Gee on “How To Accelerate the Re-Use of Data for Public Interest Purposes While Ensuring Data Rights and Community Flourishing”: “The paper begins with a description of earlier waves of open data. Emerging from freedom of information laws adopted over the last half century, the First Wave of Open Data brought about newfound transparency, albeit one only available on request to an audience largely composed of journalists, lawyers, and activists. 

The Second Wave of Open Data, seeking to go beyond access to public records and inspired by the open source movement, called upon national governments to make their data open by default. Yet, this approach too had its limitations, leaving many data silos at the subnational level and in the private sector untouched.

The Third Wave of Open Data seeks to build on earlier successes and take into account lessons learned to help open data realize its transformative potential. Incorporating insights from various data experts, the paper describes the emergence of a Third Wave driven by the following goals:

  1. Publishing with Purpose by matching the supply of data with the demand for it, providing assets that match public interests;
  2. Fostering Partnerships and Data Collaboration by forging relationships with community-based organizations, NGOs, small businesses, local governments, and others who understand how data can be translated into meaningful real-world action;
  3. Advancing Open Data at the Subnational Level by providing resources to cities, municipalities, states, and provinces to address the lack of subnational information in many regions;
  4. Prioritizing Data Responsibility and Data Rights by understanding the risks of using (and not using) data to promote and preserve the public’s general welfare.

Riding the Wave

Achieving these goals will not be an easy task and will require investments and interventions across the data ecosystem. The paper highlights eight actions that decision-makers and policymakers can take to foster more equitable, impactful benefits…(More) (PDF)”.

Co-Production of Public Services and Outcomes


Book by Elke Loeffler: “This book examines user and community co-production of public services and outcomes, currently one of the most discussed topics in the field of public management and policy. It considers co-production in a wide range of public services, with particular emphasis on health, social care and community safety, illustrated through international case studies in many of the chapters. This book draws on both quantitative and qualitative empirical research studies on co-production, and on the Governance International database of more than 70 international co-production case studies, most of which have been republished by the OECD. Academically rigorous and systematically evidence-based, the book incorporates many insights which have arisen from the extensive range of research projects and executive training programmes in co-production undertaken by the author. Written in a style which is easy and enjoyable to read, the book gives readers, both academics and practitioners, the opportunity to develop a creative understanding of the essence and implications of co-production….(More)”.

Cyber Republic


Book by George Zarkadakis: “Around the world, liberal democracies are in crisis. Citizens have lost faith in their government; right-wing nationalist movements frame the political debate. At the same time, economic inequality is increasing dramatically; digital technologies have created a new class of super-rich entrepreneurs. Automation threatens to transform the free economy into a zero-sum game in which capital wins and labor loses. But is this digital dystopia inevitable? In Cyber Republic, George Zarkadakis presents an alternative, outlining a plan for using technology to make liberal democracies more inclusive and the digital economy more equitable. Cyber Republic is no less than a guide for the coming Fourth Industrial Revolution and the post-pandemic world.

Zarkadakis, an expert on technology and management, explains how artificial intelligence, together with intelligent robotics, sophisticated sensors, communication networks, and big data, will fundamentally reshape the global economy; a new “intelligent machine age” will force us to adopt new forms of economic and political organization. He envisions a future liberal democracy in which intelligent machines facilitate citizen assemblies, helping to extend citizen rights, and blockchains and cryptoeconomics enable new forms of democratic governance and business collaboration. Moreover, the same technologies can be applied to scientific research and technological innovation. We need not fear automation, Zarkadakis argues; in a post-work future, intelligent machines can collaborate with humans to achieve the human goals of inclusivity and equality….(More)”.

Automating Society Report 2020


Bertelsmann Stiftung: “When launching the first edition of this report, we decided to call it “Automating Society”, as ADM (automated decision-making) systems in Europe were mostly new, experimental, and unmapped – and, above all, the exception rather than the norm.

This situation has changed rapidly, as is clearly shown by the more than 100 use cases of automated decision-making systems in 16 European countries compiled by a research network for the 2020 edition of the Automating Society report by Bertelsmann Stiftung and AlgorithmWatch. The report shows: even though algorithmic systems are increasingly being used by public administration and private companies, there is still a lack of transparency, oversight and competence.

The stubborn opacity surrounding the ever-increasing use of ADM systems has made it all the more urgent that we continue to increase our efforts. Therefore, we have added four countries (Estonia, Greece, Portugal, and Switzerland) to the 12 we already analyzed in the previous edition of this report, bringing the total to 16 countries. While far from exhaustive, this allows us to provide a broader picture of the ADM scenario in Europe. Considering the impact these systems may have on everyday life, and how profoundly they challenge our intuitions – if not our norms and rules – about the relationship between democratic governance and automation, we believe this is an essential endeavor….(More)”.

Algorithm Tips


About: “Algorithm Tips is here to help you start investigating algorithmic decision-making power in society.

This site offers a database of leads which you can search and filter. It’s a curated set of algorithms being used across the US government at the federal, state, and local levels. You can subscribe to alerts for when new algorithms matching your interests are found. For details on our curation methodology see here.

We also provide resources such as example investigations, methodological tips, and guidelines for public records requests related to algorithms.

Finally, we blog about some of the more interesting examples of algorithms we’ve uncovered in our research….(More)”.

Statistical illiteracy isn’t a niche problem. During a pandemic, it can be fatal


Article by Carlo Rovelli: “In the institute where I used to work a few years ago, a rare non-infectious illness hit five colleagues in quick succession. There was a sense of alarm, and a hunt for the cause of the problem. In the past the building had been used as a biology lab, so we thought that there might be some sort of chemical contamination, but nothing was found. The level of apprehension grew. Some looked for work elsewhere.

One evening, at a dinner party, I mentioned these events to a friend who is a mathematician, and he burst out laughing. “There are 400 tiles on the floor of this room; if I throw 100 grains of rice into the air, will I find,” he asked us, “five grains on any one tile?” We replied in the negative: there was only one grain for every four tiles, not enough to have five on a single tile.

We were wrong. We tried numerous times, actually throwing the rice, and there was always a tile with two, three, four, even five or more grains on it. Why? Why would grains “flung randomly” not arrange themselves into good order, equidistant from each other?

Because they land, precisely, by chance, and there are always disorderly grains that fall on tiles where others have already gathered. Suddenly the strange case of the five ill colleagues seemed very different. Five grains of rice falling on the same tile does not mean that the tile possesses some kind of “rice-attracting” force. Five people falling ill in a workplace did not mean that it must be contaminated. The institute where I worked was part of a university. We, know-all professors, had fallen into a gross statistical error. We had become convinced that the “above average” number of sick people required an explanation. Some had even gone elsewhere, changing jobs for no good reason.

Life is full of stories such as this. Insufficient understanding of statistics is widespread. The current pandemic has forced us all to engage in probabilistic reasoning, from governments having to recommend behaviour on the basis of statistical predictions, to people estimating the probability of catching the virus while taking part in common activities. Our extensive statistical illiteracy is today particularly dangerous.

We use probabilistic reasoning every day, and most of us have a vague understanding of averages, variability and correlations. But we use them in an approximate fashion, often making errors. Statistics sharpen and refine these notions, giving them a precise definition, allowing us to reliably evaluate, for instance, whether a medicine or a building is dangerous or not.

Society would gain significant advantages if children were taught the fundamental ideas of probability theory and statistics: in simple form in primary school, and in greater depth in secondary school….(More)”.
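The rice-and-tiles anecdote above is easy to check numerically. The short Python sketch below is not part of Rovelli's article; it simply takes the tile and grain counts from the story, simulates many throws, and reports how often at least one tile ends up holding a small cluster of grains. It illustrates the mathematician's point: clumps arise from chance alone, with no “rice-attracting” force required.

```python
import random

TILES = 400       # tiles on the dining-room floor
GRAINS = 100      # grains of rice thrown into the air
TRIALS = 10_000   # number of simulated throws

# For each cluster size, count the throws in which at least one tile
# ends up holding that many grains or more.
thresholds = [2, 3, 4, 5]
hits = {t: 0 for t in thresholds}

for _ in range(TRIALS):
    counts = [0] * TILES
    for _ in range(GRAINS):
        counts[random.randrange(TILES)] += 1  # each grain lands on a uniformly random tile
    busiest = max(counts)
    for t in thresholds:
        if busiest >= t:
            hits[t] += 1

for t in thresholds:
    print(f"Throws with some tile holding >= {t} grains: {hits[t] / TRIALS:.1%}")
```

Under a Poisson approximation (an average of 100/400 = 0.25 grains per tile), a pair on some tile is all but guaranteed and a triple shows up in roughly half of the throws, which is exactly the clumping the dinner guests observed.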

Technology and Democracy: understanding the influence of online technologies on political behaviour and decision-making


Report by the Joint Research Centre (JRC) of the EU: “…The report analyses the cognitive challenges posed by four pressure points (the attention economy, platform choice architectures, algorithmic content curation, and disinformation) and makes policy recommendations to address them.

Specific actions could include banning microtargeting for political ads, introducing transparency rules so that users understand how an algorithm uses their data and to what effect, or requiring online platforms to provide reports to users showing when, how and which of their data is sold.

This report is the second output from the JRC’s Enlightenment 2.0 multi-annual research programme….(More)”.

A qualitative study of big data and the opioid epidemic: recommendations for data governance


Paper by Elizabeth A. Evans, Elizabeth Delorme, Karl Cyr & Daniel M. Goldstein: “The opioid epidemic has enabled rapid and unsurpassed use of big data on people with opioid use disorder to design initiatives to battle the public health crisis, generally without adequate input from impacted communities. Efforts informed by big data are saving lives, yielding significant benefits. Uses of big data may also undermine public trust in government and cause other unintended harms….

We conducted focus groups and interviews in 2019 with 39 big data stakeholders (gatekeepers, researchers, patient advocates) who had interest in or knowledge of the Public Health Data Warehouse maintained by the Massachusetts Department of Public Health.

Concerns regarding big data on opioid use are rooted in potential privacy infringements due to linkage of previously distinct data systems, increased profiling and surveillance capabilities, limitless lifespan, and lack of explicit informed consent. Also problematic is the inability of affected groups to control how big data are used, the potential of big data to increase stigmatization and discrimination of those affected despite data anonymization, and uses that ignore or perpetuate biases. Participants support big data processes that protect and respect patients and society, ensure justice, and foster patient and public trust in public institutions. Recommendations for ethical big data governance offer ways to narrow the big data divide (e.g., prioritize health equity, set off-limits topics/methods, recognize blind spots), enact shared data governance (e.g., establish community advisory boards), cultivate public trust and earn social license for big data uses (e.g., institute safeguards and other stewardship responsibilities, engage the public, communicate the greater good), and refocus ethical approaches.

Using big data to address the opioid epidemic poses ethical concerns which, if unaddressed, may undermine its benefits. Findings can inform guidelines on how to conduct ethical big data governance and in ways that protect and respect patients and society, ensure justice, and foster patient and public trust in public institutions….(More)”

Consumer Reports Study Finds Marketplace Demand for Privacy and Security


Press Release: “American consumers are increasingly concerned about privacy and data security when purchasing new products and services, which may be a competitive advantage for companies that act on these consumer values, a new Consumer Reports study finds.

The new study, “Privacy Front and Center” from CR’s Digital Lab with support from Omidyar Network, looks at the commercial benefits for companies that differentiate their products based on privacy and data security. The study draws from a nationally representative CR survey of 5,085 adult U.S. residents conducted in February 2020, a meta-analysis of 25 years of public opinion studies, and a conjoint analysis that seeks to quantify how consumers weigh privacy and security in their hardware and software purchasing decisions. 

“This study shows that raising the standard for privacy and security is a win-win for consumers and the companies,” said Ben Moskowitz, the director of the Digital Lab at Consumer Reports. “Given the rapid proliferation of internet connected devices, the rise in data breaches and cyber attacks, and the demand from consumers for heightened privacy and security measures, there’s an undeniable business case for companies to invest in creating more private and secure products.” 

Here are some of the key findings from the study:

  • According to CR’s February 2020 nationally representative survey, 74% of consumers are at least moderately concerned about the privacy of their personal data.
  • Nearly all Americans (96%) agree that more should be done to ensure that companies protect the privacy of consumers.
  • A majority of smart product owners (62%) worry about potential loss of privacy when buying them for their home or family.
  • The privacy/security-conscious consumer class seems to include more men and people of color.
  • Experiencing a data breach correlates with a higher willingness to pay for privacy, and 30% of Americans have experienced one.
  • Of the Android users who switched to iPhones, 32% indicated doing so because of Apple’s perceived privacy or security benefits relative to Android….(More)”.

Policy making in a digital world


Report by Lewis Lloyd: “…Policy makers across government lack the necessary skills and understanding to take advantage of digital technologies when tackling problems such as coronavirus and climate change. This report says already poor data management has been exacerbated by a lack of leadership, with the role of government chief data officer unfilled since 2017. These failings have been laid bare by the stuttering coronavirus Test and Trace programme. Drawing on interviews with policy experts and digital specialists inside and outside government, the report argues that better use of data and new technologies, such as artificial intelligence, would improve policy makers’ understanding of problems like coronavirus and climate change, and aid collaboration with colleagues, external organisations and the public in seeking solutions to them. It urges government to trial innovative applications of data and technology to a wider range of policies, but warns recent failures such as the A-level algorithm fiasco mean it must also do more to secure public trust in its use of such technologies. This means strengthening oversight and initiating a wider public debate about the appropriate use of digital technologies, and improving officials’ understanding of the limitations of data-driven analysis. The report recommends that the government:

  1. Appoints a chief data officer as soon as possible to drive work on improving data quality, tackle problems with legacy IT and make sure new data standards are applied and enforced across government.
  2. Places more emphasis on statistical and technological literacy when recruiting and training policy officials.
  3. Sets up a new independent body to lead on public engagement in policy making, with an initial focus on how and when government should use data and technology…(More)”.