Lessons from all democracies


David Stasavage at Aeon: “Today, many people see democracy as under threat in a way that only a decade ago seemed unimaginable. Following the fall of the Berlin Wall in 1989, it seemed like democracy was the way of the future. But nowadays, the state of democracy looks very different; we hear about ‘backsliding’ and ‘decay’ and other descriptions of a sort of creeping authoritarianism. Some long-established democracies, such as the United States, are witnessing a violation of governmental norms once thought secure, and this has culminated in the recent insurrection at the US Capitol. If democracy is a torch that shines for a time before then burning out – think of Classical Athens and Renaissance city republics – it all feels as if we might be heading toward a new period of darkness. What can we do to reverse this apparent trend and support democracy?

First, we must dispense with the idea that democracy is like a torch that gets passed from one leading society to another. The core feature of democracy – that those who rule can do so only with the consent of the people – wasn’t invented in one place at one time: it evolved independently in a great many human societies.

Over several millennia and across multiple continents, early democracy was an institution in which rulers governed jointly with councils and assemblies of the people. From the Huron (who called themselves the Wendats) and the Iroquois (who called themselves the Haudenosaunee) in the Northeastern Woodlands of North America, to the republics of Ancient India, to examples of city governance in ancient Mesopotamia, these councils and assemblies were common. Classical Greece provided particularly important instances of this democratic practice, and it’s true that the Greeks gave us a language for thinking about democracy, including the word demokratia itself. But they didn’t invent the practice. If we want to better understand the strengths and weaknesses of our modern democracies, then early democratic societies from around the world provide important lessons.

The core feature of early democracy was that the people had power, even if multiparty elections (today often thought to be a definitive feature of democracy) didn’t happen. The people, or at least some significant fraction of them, exercised this power in many different ways. In some cases, a ruler was chosen by a council or assembly and was limited to being first among equals. In other instances, a ruler inherited their position but was obliged to seek the consent of the people before taking actions both large and small. The alternative to early democracy was autocracy, a system in which one person ruled on their own via bureaucratic subordinates whom they had recruited and remunerated. The word ‘autocracy’ is a bit of a misnomer here, in that no one in this position ever truly ruled on their own, but it does signify a different way of organising political power.

Early democratic governance is clearly apparent in some ancient societies in Mesopotamia as well as in India. It flourished in a number of places in the Americas before European conquest, such as among the Huron and the Iroquois in the Northeastern Woodlands and in the ‘Republic of Tlaxcala’ that abutted the Triple Alliance, more commonly known as the Aztec Empire. It was also common in precolonial Africa. In all of these societies there were several defining features that tended to reinforce early democracy: small scale, a need for rulers to depend on the people for knowledge, and finally the ability of members of society to exit to other locales if they were unhappy with a ruler. These three features were not always present in the same measure, but collectively they helped to underpin early democracy….(More)”

Female Victims of Gendered Violence, Their Human Rights and the Innovative Use of Data Technology to Predict, Prevent and Pursue Harms


Paper by Jamie Grace: “This short paper has the objective of making the case for more investment to explore the use of data-driven technology to predict, prevent and pursue criminal harms against women. The paper begins with an overview of the contemporary scale of the issues, and the current problem of recording data on serious violent and sexual offending against women, before moving on to consider the current status and strength of positive obligations under UK human rights law to protect victims of intimate partner violence. The paper then looks at some examples of how data tech can augment policing of serious criminal harms against women, before turning to consider some of the legal problems concerning potential bias, inaccuracies and transparency that can dog ‘predictive policing’ in particular. Finally, a conclusion is offered up which explores the degree to which investment and exploration of the predictive policing of intimate partner violence must be pursued at the same time as better oversight mechanisms are also developed for the use of machine learning technology in public protection roles, since the two emphases go hand in hand…(More)”.

Biden Creates Road Map for Equitable State and Local Data


Daniel Castro at GovTech: “On his first day in office, President Biden issued a flurry of administrative actions to reverse a number of President Trump’s policies and address the ongoing coronavirus pandemic. One of these included an executive order to advance racial equity and provide support for underserved communities. Notably, the order recognizes that achieving this goal will be difficult, if not impossible, without better data. This is a lesson that many state and local governments should take to heart by revisiting their collection policies to ensure data is equitable.

The executive order establishes that it is the policy of the Biden administration to “pursue a comprehensive approach to advancing equity for all, including people of color and others who have been historically underserved, marginalized, and adversely affected by persistent poverty and inequality.” To that end, the order dedicates a section to establishing an interagency working group on equitable data tasked with identifying inadequacies in federal data collection policies and programs, and recommending strategies for addressing any deficiencies.   

An inability to disaggregate data prevents policymakers from identifying disparate impacts of government programs on different populations in a variety of areas including health care, education, criminal justice, workforce and housing. Indeed, the U.S. Commission on Civil Rights has found that “data collection and reporting are essential to effective civil rights enforcement, and that a lack of effective civil rights data collection is problematic.”

This problem has repeatedly been on display throughout the COVID-19 pandemic. For example, at the outset of the pandemic last year, nearly half of states did not report data on race or ethnicity for those who were tested, hospitalized or died of COVID-19. And while the government has tried to mount a data-driven response to the COVID-19 pandemic, a lack of data about different groups means that their needs are often hidden from policymakers….(More)”.

The Techlash and Tech Crisis Communication


Book by Nirit Weiss-Blatt: “This book provides an in-depth analysis of the evolution of tech journalism. The emerging tech-backlash is a story of pendulum swings: we are currently in a period of tech-dystopianism after a long period of tech-utopianism. Tech companies were used to ‘cheerleading’ coverage of product launches. This long tech-press honeymoon ended and was replaced by a new era of mounting criticism focused on tech’s negative impact on society. When and why did tech coverage shift? How did tech companies respond to the rise of tech criticism?

The book depicts three main eras: Pre-Techlash, Techlash, and Post-Techlash. The reader is taken on a journey from computer magazines, through tech blogs to the upsurge of tech investigative reporting. It illuminates the profound changes in the power dynamics between the media and the tech giants it covers.

The interplay between tech journalism and tech PR has been underexplored. Through analyses of both tech media coverage and the companies’ crisis responses, this book examines the roots and characteristics of the Techlash and answers the question ‘How did we get here?’. Insightful observations from tech journalists and tech public relations professionals complement the research data, and together they tell the story of the TECHLASH. The book includes theoretical and practical implications for both tech enthusiasts and critics….(More)”.

Citizen social science in practice: the case of the Empty Houses Project


Paper by Alexandra Albert: “The growth of citizen science and participatory science, where non-professional scientists voluntarily participate in scientific activities, raises questions around the ownership and interpretation of data, issues of data quality and reliability, and new kinds of data literacy. Citizen social science (CSS), as an approach that bridges these fields, calls into question the way in which research is undertaken, as well as who can collect data, what data can be collected, and what such data can be used for. This article outlines a case study—the Empty Houses Project—to explore how CSS plays out in practice, and to reflect on the opportunities and challenges it presents. The Empty Houses Project was set up to investigate how citizens could be mobilised to collect data about empty houses in their local area, so as to potentially contribute towards tackling a pressing policy issue. The study shows how the possibilities of CSS exceed the dominant view of it as a new means of creating data repositories. Rather, it considers how the data produced in CSS is an epistemology, and a politics, not just a realist tool for analysis….(More)”.

Establishment of Sustainable Data Ecosystems


Report and Recommendations for the evolution of spatial data infrastructures by S. Martin, P. Gautier, S. Turki and A. Kotsev: “The purpose of this study is to identify and analyse a set of successful data ecosystems and to put forward recommendations that can act as catalysts of data-driven innovation, in line with the recently published European data strategy. The work presented here aims, to the largest extent possible, to identify actionable items.

Specifically, the study offers insights into approaches that would help existing spatial data infrastructures (SDIs), which are usually governed by the public sector and driven by data providers, evolve into self-sustaining data ecosystems in which different actors (including providers, users and intermediaries) contribute and gain social and economic value in accordance with their specific objectives and incentives.

The overall approach described in this document is based on the identification and documentation of a set of case studies of existing data ecosystems, together with use cases for developing applications that draw on data from two or more data ecosystems, based on existing operational or experimental applications. Following a literature review on data ecosystem thinking and modelling, a framework consisting of three parts (Annex I) was designed. The first part is an ecosystem summary, giving an overall representation of the ecosystem’s key aspects. The second is dedicated to the ecosystem’s value dynamics, illustrating how the ecosystem is structured through the resources exchanged between stakeholders and the value associated with them.

The third part, the ecosystem data flows, represents the ecosystem from a complementary and more technical perspective, capturing the flows and the data cycles associated with a given scenario. These two parts provide good proxies for evaluating the health and the maturity of a data ecosystem…(More)”.

The Ethics and Laws of Medical Big Data


Chapter by Hrefna Gunnarsdottir et al: “The COVID-19 pandemic has highlighted that leveraging medical big data can help to better predict and control outbreaks from the outset. However, there are still challenges to overcome in the 21st century to efficiently use medical big data, promote innovation and public health activities and, at the same time, adequately protect individuals’ privacy. The metaphor that property is a “bundle of sticks”, each representing a different right, applies equally to medical big data. Understanding medical big data in this way raises a number of questions, including: Who has the right to make money off its buying and selling, or is it inalienable? When does medical big data become sufficiently stripped of identifiers that the rights of an individual concerning the data disappear? How have different regimes such as the General Data Protection Regulation in Europe and the Health Insurance Portability and Accountability Act in the US answered these questions differently? In this chapter, we will discuss three topics: (1) privacy and data sharing, (2) informed consent, and (3) ownership. We will identify and examine ethical and legal challenges and make suggestions on how to address them. In our discussion of each of the topics, we will also give examples related to the use of medical big data during the COVID-19 pandemic, though the issues we raise extend far beyond it….(More)”.

The Flip Side of Free: Understanding the Economics of the Internet


Book by Michael Kende: “The upside of the Internet is free Wi-Fi at Starbucks, Facetime over long distances, and nearly unlimited data for downloading or streaming. The downside is that our data goes to companies that use it to make money, our financial information is exposed to hackers, and the market power of technology companies continues to increase. In The Flip Side of Free, Michael Kende shows that free Internet comes at a price. We’re beginning to realize this. Our all-purpose techno-caveat is “I love my smart speaker,” but is it really tracking everything I do? listening to everything I say?

Kende explains the unique economics of the Internet and the paradoxes that result. The most valuable companies in the world are now Internet companies, built on data often exchanged for free content and services. Many users know the impact of this trade-off on privacy but continue to use the services anyway. Moreover, although the Internet lowers barriers for companies to enter markets, it is hard to compete with the largest providers. We complain about companies having too much data, but developing countries without widespread Internet usage may suffer from the reverse: not enough data collection for the development of advanced services—which leads to a worsening data divide between developed and developing countries.

What’s the future of free? Data is the price of free service, and the new currency of the Internet age. There’s nothing necessarily wrong with free, Kende says, as long as we anticipate and try to mitigate what’s on the flip side…(More)”.

Policy 2.0 in the Pandemic World: What Worked, What Didn’t, and Why


Blog by David Osimo: “…So how, then, did these new tools perform when confronted with the once-in-a-lifetime crisis of a vast global pandemic?

It turns out, some things worked. Others didn’t. And the question of how these new policymaking tools functioned in the heat of battle is already generating valuable ammunition for future crises.

So what worked?

Policy modelling – an analytical framework designed to anticipate the impact of decisions by simulating the interaction of multiple agents in a system, rather than just the independent actions of atomised and rational humans – took centre stage in the pandemic and emerged with reinforced importance in policymaking. Notably, it helped governments predict how and when to introduce lockdowns or open up. But even there uptake was limited. A recent survey of the 28 models used in different countries to fight the pandemic showed that they were traditional ones, not the modern “agent-based models” or “system dynamics” models supposed to deal best with uncertainty. Meanwhile, the concepts of systems science became prominent and widely communicated. It quickly became clear in the course of the crisis that social distancing was more a method to reduce the systemic pressure on the health services than a way to avoid individual contagion (the so-called “flatten the curve” approach).

Open government data has long promised to allow citizens and businesses to build new services at scale and to make government accountable. The pandemic largely confirmed how important this data could be in allowing citizens to analyse things independently. Hundreds of analysts from all walks of life and disciplines used social media to discuss their analyses and predictions, many becoming household names and go-to people in their countries and regions. Yes, this led to noise and a so-called “infodemic,” but overall it served as a fundamental tool to increase confidence and consensus behind the policy measures and to hold governments accountable for their actions. For instance, one Catalan analyst demonstrated that vaccines were not provided during weekends and forced the government to change its stance. Yet it is also clear that not all went well, most notably on the supply side: governments published low-quality data, in PDF, with delays, or with data missing due to spreadsheet abuse.

In most cases, there was little demand for sophisticated data-publishing solutions such as “linked” or “FAIR” data, although the uptake of these kinds of solutions was particularly significant when it came time to share crucial research data. Experts argue that the trend towards open science has accelerated dramatically and irreversibly in the last year, as shown by the portal https://www.covid19dataportal.org/, which allowed the sharing of high-quality data for scientific research….

But other new policy tools proved less easy to use and ultimately ineffective. Collaborative governance, for one, promised to leverage the knowledge of thousands of citizens to improve public policies and services. In practice, methodologies aimed at involving citizens in decision-making and service design were of little use. Decisions related to locking down and opening up were taken in closed committees, in top-down mode. Individual exceptions certainly exist: Milan, one of the cities worst hit by the pandemic, launched a co-created strategy for opening up after the lockdown, receiving almost 3,000 contributions to the consultation. But overall, such initiatives had limited impact and visibility. With regard to the co-design of public services, in times of emergency there was no time for prototyping or focus groups. Services such as emergency financial relief had to be launched in a hurry and “just work.”

Citizen science promised to make every citizen a consensual data source for monitoring complex phenomena in real time through apps and Internet-of-Things sensors. In the pandemic, there were initially great expectations for digital contact-tracing apps to allow real-time monitoring of contagion, most notably through Bluetooth connections on phones. However, they were mostly a disappointment. Citizens were reluctant to install them, and contact tracing soon turned out to be much more complicated – and human-intensive – than originally thought. The huge debate between technology and privacy was followed by very limited impact. Much ado about nothing.

Behavioural economics (commonly known as nudge theory) is probably the most visible failure of the pandemic. It promised to move beyond the traditional carrots (public funding) and sticks (regulation) of delivering policy objectives by adopting an experimental method to influence or “nudge” human behaviour towards desired outcomes. The reality is that soft nudges proved an ineffective alternative to hard lockdown choices. What makes this failure uniquely damaging is that such methods took centre stage in the initial phase of the pandemic, and in particular informed the United Kingdom’s lax approach in the first months on the basis of a hypothetical and unproven “behavioural fatigue.” This attracted heavy criticism of the excessive reliance on nudges by the United Kingdom government, a legacy of Prime Minister David Cameron’s administration. The origin of such criticism seems to lie not in the method’s shortcomings per se, as it had previously enjoyed success in more specific cases, but in the backlash from excessive expectations and promises, epitomised in the quote of a prominent behavioural economist: “It’s no longer a matter of supposition as it was in 2010 […] we can now say with a high degree of confidence these models give you best policy.”

Three factors emerge as the key determinants behind success and failure: maturity, institutions and leadership….(More)”.

Intellectual Property and Artificial Intelligence


A literature review by the Joint Research Centre: “Artificial intelligence has entered into the sphere of creativity and ingenuity. Recent headlines refer to paintings produced by machines, music performed or composed by algorithms or drugs discovered by computer programs. This paper discusses the possible implications of the development and adoption of this new technology for the intellectual property framework and presents the opinions expressed by practitioners and legal scholars in recent publications. The literature review, although not intended to be exhaustive, reveals a series of questions that call for further reflection. These concern the protection of artificial intelligence by intellectual property, the use of data to feed algorithms, the protection of the results generated by intelligent machines, as well as the relationship between ethical requirements of transparency and explainability and the interests of rights holders….(More)”.