Implications of the use of artificial intelligence in public governance: A systematic literature review and a research agenda


Paper by Anneke Zuiderwijk, Yu-Che Chen and Fadi Salem: “To lay the foundation for the special issue that this research article introduces, we 1) present a systematic review of existing literature on the implications of the use of Artificial Intelligence (AI) in public governance and 2) develop a research agenda. First, an assessment based on 26 articles on this topic reveals much exploratory, conceptual, qualitative, and practice-driven research in studies reflecting the increasing complexities of using AI in government – and the resulting implications, opportunities, and risks thereof for public governance. Second, based on both the literature review and the analysis of articles included in this special issue, we propose a research agenda comprising eight process-related recommendations and seven content-related recommendations. Process-wise, future research on the implications of the use of AI for public governance should move towards more public sector-focused, empirical, multidisciplinary, and explanatory research while focusing more on specific forms of AI rather than AI in general. Content-wise, our research agenda calls for the development of solid, multidisciplinary, theoretical foundations for the use of AI for public governance, as well as investigations of effective implementation, engagement, and communication plans for government strategies on AI use in the public sector. Finally, the research agenda calls for research into managing the risks of AI use in the public sector, governance modes possible for AI use in the public sector, performance and impact measurement of AI use in government, and impact evaluation of scaling-up AI usage in the public sector….(More)”.

We know what you did during lockdown


An FT Film written by James Graham: “The Covid-19 pandemic has so scrambled our lives that we have barely blinked when the state has told us how many people can attend a wedding, where we can travel or even whether we should hug each other. This normalisation of the abnormal, during the moral panic of a national healthcare emergency, is the subject of People You May Know, a short film written by the playwright James Graham and commissioned by the Financial Times.

One of Britain’s most inquisitive and versatile playwrights, Graham says he has long been worried about the expansion of the “creeping data state” and has an almost “existential anxiety about privacy on all levels, emotional, philosophical, political, social”. Those concerns were first explored in his play Privacy (2014) in response to the revelations of Edward Snowden, the US security contractor turned whistleblower, who described how “the architecture of oppression” of the surveillance state had been built, if not yet fully utilised. 

In his new FT film, Graham investigates how the response to the pandemic has enabled the further intrusion of the data state and what it might mean for us all. “The power of drama is that it allows you to take a few more stepping stones into the imagined future,” he says in a Google Meet interview. …(More) (Film)”

The Case for Better Governance of Children’s Data: A Manifesto



Report by Jasmina Byrne, Emma Day and Linda Raftree: “Every child is different, with unique identities, and their capacities and circumstances evolve over their lifecycle. Children are more vulnerable than adults and are less able to understand the long-term implications of consenting to their data collection. For these reasons, children’s data deserve to be treated differently.

While responsible data use can underpin many benefits for children, ensuring that children are protected, empowered and granted control of their data is still a challenge.

To maximise the benefits of data use for children and to protect them from harm requires a new model of data governance that is fitting for the 21st century.

UNICEF has worked with 17 global experts to develop a Manifesto that articulates a vision for a better approach to children’s data.

This Manifesto includes key action points and a call for a governance model purposefully designed to deliver on the needs and rights of children. It is the first step in ensuring that children’s rights are given due weight in data governance legal frameworks and processes as they evolve around the world….(More)”

Ethiopia’s blockchain deal is a watershed moment – for the technology, and for Africa


Iwa Salami at The Conversation: “At the launch of bitcoin in 2009, the potential of the underlying technology, the blockchain, was not fully appreciated.

What have not been fully exploited are the unique features of blockchain technology that can improve the lives of people and businesses. These include the fact that it is open-source software. This makes its source code legally and freely available to end users, who can use it to create new products and services. Another significant feature is that it is decentralised, democratising the operation of the services built on it. Control of the services built on the blockchain isn’t in the hands of an individual or a single entity but involves all those connected to the network.

In addition, it enables peer-to-peer interaction between those connected to the network. This is key, as it enables parties to transact directly without intermediaries or third parties. Finally, it has inbuilt security: data stored on it is effectively immutable and cannot easily be altered, and new data can be added only after it is verified by everyone in the network.
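As an aside (not from Salami's article), the immutability point can be made concrete with a toy sketch. In the minimal, hypothetical Python example below, each block commits to the hash of its predecessor, so altering earlier data breaks verification; real blockchains add consensus among network participants on top of this linking.

```python
import hashlib
import json


def block_hash(block: dict) -> str:
    """Hash a block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()


def add_block(chain: list, data: str) -> None:
    """Append a new block that commits to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    chain.append(block)


def verify(chain: list) -> bool:
    """Recompute every hash; any edit to earlier data breaks the links."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True


chain: list = []
add_block(chain, "record A")
add_block(chain, "record B")
print(verify(chain))           # True
chain[0]["data"] = "tampered"  # altering history...
print(verify(chain))           # False: the chain no longer verifies
```

The example is deliberately simplified (no distributed network, no consensus protocol); it only illustrates why data already written to a chain is hard to change unnoticed.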

Unfortunately, bitcoin, the project that introduced blockchain technology, has hogged the limelight, diverting attention from the technology’s underlying potential benefits….

But this is slowly changing.

A few companies have begun showcasing blockchain capabilities to various African countries. Unlike most other cryptocurrency blockchains, which focus on private-sector use in developed regions such as Europe and North America, their approach has been to target governments and public institutions in the developing world.

In April the Ethiopian government confirmed that it had signed a deal to create a national database of student and teacher IDs using a decentralised digital identity solution. The deal involves providing IDs for 5 million students across 3,500 schools; the IDs will be used to store educational records.

This is the largest blockchain deal ever to be signed by a government and has been making waves in the crypto-asset industry.

I believe that the deal marks a watershed moment for the use of blockchain and the crypto-asset industry, and for African economies because it offers the promise of blockchain being used for real socio-economic change. The deal means that blockchain technology will be used to provide digital identity to millions of Ethiopians. Digital identity – missing in most African countries – is the first step to real financial inclusion, which in turn has been shown to carry a host of benefits….(More)”.

Reimagining data responsibility: 10 new approaches toward a culture of trust in re-using data to address critical public needs


Commentary by Stefaan Verhulst in Data & Policy: “Data and data science offer tremendous potential to address some of our most intractable public problems (including the Covid-19 pandemic). At the same time, recent years have shown some of the risks of existing and emerging technologies. An updated framework is required to balance potential and risk, and to ensure that data is used responsibly. Data responsibility is not itself a new concept. However, amid a rapidly changing technology landscape, it has become increasingly clear that the concept may need updating, in order to keep up with new trends such as big data, open data, the Internet of Things, artificial intelligence, and machine learning. This paper seeks to outline 10 approaches and innovations for data responsibility in the 21st century….

Figure: 10 New Approaches for Data Responsibility (Stefaan Verhulst)

Each of these is described at greater length in the paper, and illustrated with examples from around the world. Put together, they add up to a framework or outline for policy makers, scholars, and activists who seek to harness the potential of data to solve complex social problems and advance the public good. Needless to say, the 10 approaches outlined here represent just a start. We envision this paper more as an exercise in agenda-setting than a comprehensive survey…(More)”.

Tech for disabled people is booming around the world. So where’s the funding?


Article by Devi Lockwood: “Erick Ponce works in a government communications department in northern Ecuador. The 26-year-old happens to be deaf — a disability he has had since childhood. Communicating fluidly with his non-signing colleagues at work, and in public spaces like the supermarket, has been a lifelong challenge. 

In 2017, Ponce became one of the first users of an experimental app called SpeakLiz, developed by an Ecuadorian startup called Talov. It transforms written text to sound, transcribes spoken words, and can alert a deaf or hard-of-hearing person to sounds such as an ambulance, motorcycles, music, or a crying baby. 

Once he began using SpeakLiz, Ponce’s coworkers — and his family — were able to understand him more easily. “You cannot imagine what it feels like to speak with your son after 20 years,” his father told the app’s engineers. Now a part of the Talov team, Ponce demos new products to make them better before they hit the market. 

The startup has launched two subscription apps on iOS and Android: SpeakLiz, in 2017, for the hearing impaired, and Vision, in 2019, for the visually impaired. Talov’s founders, Hugo Jácome and Carlos Obando, have been working on the apps for over five years. 

SpeakLiz and Vision are, by many measures, successful. Their software is used by more than 7,000 people in 81 countries and is available in 35 languages. The founders won an award from MIT Technology Review and a contest organized by the History Channel. Talov was named among the top 100 most innovative startups in Latin America in 2019. 

But the startup is still struggling. Venture capitalists aren’t knocking on its door. Jácome and Obando sold some of their possessions to raise enough money to launch, and the team has next to no funding to continue expanding.

Although the last few years have seen significant advances in technology and innovation for disabled people, critics say the market is undervalued….(More)”.

Lobbying in the 21st Century: Transparency, Integrity and Access


OECD Report: “Lobbying, as a way to influence and inform governments, has been part of democracy for at least two centuries, and remains a legitimate tool for influencing public policies. However, it carries risks of undue influence. Lobbying in the 21st century has also become increasingly complex, including new tools for influencing government, such as social media, and a wide range of actors, such as NGOs, think tanks and foreign governments. This report takes stock of the progress that countries have made in implementing the OECD Principles for Transparency and Integrity in Lobbying. It reflects on new challenges and risks related to the many ways special interest groups attempt to influence public policies, and reviews tools adopted by governments to effectively safeguard impartiality and fairness in the public decision-making process….(More)”.

Do You See What I See? Capabilities and Limits of Automated Multimedia Content Analysis



Report by Dhanaraj Thakur and  Emma Llansó: “The ever-increasing amount of user-generated content online has led, in recent years, to an expansion in research and investment in automated content analysis tools. Scrutiny of automated content analysis has accelerated during the COVID-19 pandemic, as social networking services have placed a greater reliance on these tools due to concerns about health risks to their moderation staff from in-person work. At the same time, there are important policy debates around the world about how to improve content moderation while protecting free expression and privacy. In order to advance these debates, we need to understand the potential role of automated content analysis tools.

This paper explains the capabilities and limitations of tools for analyzing online multimedia content and highlights the potential risks of using these tools at scale without accounting for their limitations. It focuses on two main categories of tools: matching models and computer prediction models. Matching models include cryptographic and perceptual hashing, which compare user-generated content with existing and known content. Predictive models (including computer vision and computer audition) are machine learning techniques that aim to identify characteristics of new or previously unknown content….(More)”.
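As a rough illustration of that distinction (a toy sketch, not the tools the report reviews): cryptographic hashing only matches byte-identical content, while a perceptual hash compares a coarse fingerprint, so a slightly re-encoded copy can still be recognized. The `average_hash` below is a simplified, hypothetical stand-in for real perceptual-hashing algorithms.

```python
import hashlib


def crypto_hash(data: bytes) -> str:
    """Cryptographic hashing: any change in the bytes yields a different digest."""
    return hashlib.sha256(data).hexdigest()


def average_hash(pixels: list[list[int]]) -> list[int]:
    """Toy perceptual hash: threshold each grayscale pixel against the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]


def hamming(a: list[int], b: list[int]) -> int:
    """Count differing bits; a small distance suggests near-duplicate content."""
    return sum(x != y for x, y in zip(a, b))


original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 28, 225],
            [11, 215, 27, 235]]
# A slightly re-encoded copy: pixel values shift a little, so the bytes differ.
recompressed = [[12, 198, 32, 218],
                [14, 212, 24, 228],
                [13, 203, 29, 226],
                [10, 217, 26, 233]]

print(crypto_hash(bytes(sum(original, []))) ==
      crypto_hash(bytes(sum(recompressed, []))))                    # False: exact match fails
print(hamming(average_hash(original), average_hash(recompressed)))  # 0: perceptual match holds
```

Production systems use far more robust fingerprints (and face exactly the accuracy and evasion trade-offs the report examines), but the basic contrast between exact matching and similarity matching is the one shown here.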

The Filing Cabinet


Essay by Craig Robertson: “The filing cabinet was critical to the information infrastructure of the 20th century. Like most infrastructure, it was usually overlooked…. The subject of this essay emerged by chance. I was researching the history of the U.S. passport, and had spent weeks at the National Archives, struggling through thousands of reels of unindexed microfilm records of 19th-century diplomatic correspondence; then I arrived at the records for 1906. That year, the State Department adopted a numerical filing system. Suddenly, every American diplomatic office began using the same number for passport correspondence, with decimal numbers subdividing issues and cases. Rather than scrolling through microfilm images of bound pages organized chronologically, I could go straight to passport-relevant information that had been gathered in one place.

I soon discovered that I had Elihu Root to thank for making my research easier. A lawyer whose clients included Andrew Carnegie, Root became secretary of state in 1905. But not long after he arrived, the prominent corporate lawyer described himself as “a man trying to conduct the business of a large metropolitan law-firm in the office of a village squire.”  The department’s record-keeping practices contributed to his frustration. As was then common in American offices, clerks used press books or copybooks to store incoming and outgoing correspondence in chronologically ordered bound volumes with limited indexing. For Root, the breaking point came when a request for a handful of letters resulted in several bulky volumes appearing on his desk. His response was swift: he demanded that a vertical filing system be adopted; soon the department was using a numerical subject-based filing system housed in filing cabinets. 

The shift from bound volumes to filing systems is a milestone in the history of classification; the contemporaneous shift to vertical filing cabinets is a milestone in the history of storage….(More)”.

Side-Stepping Safeguards, Data Journalists Are Doing Science Now


Article by Irineo Cabreros: “News stories are increasingly told through data. Witness the Covid-19 time series that decorate the homepages of every major news outlet; the red and blue heat maps of polling predictions that dominate the runup to elections; the splashy, interactive plots that dance across the screen.

As a statistician who handles data for a living, I welcome this change. News now speaks my favorite language, and the general public is developing a healthy appetite for data, too.

But many major news outlets are no longer just visualizing data, they are analyzing it in ever more sophisticated ways. For example, at the height of the second wave of Covid-19 cases in the United States, The New York Times ran a piece declaring that surging case numbers were not attributable to increased testing rates, despite President Trump’s claims to the contrary. The thrust of The Times’ argument was summarized by a series of plots that showed the actual rise in Covid-19 cases far outpacing what would be expected from increased testing alone. These weren’t simple visualizations; they involved assumptions and mathematical computations, and they provided the cornerstone for the article’s conclusion. The plots themselves weren’t sourced from an academic study (although the author on the byline of the piece is a computer science Ph.D. student); they were produced through “an analysis by The New York Times.”
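The arithmetic behind such an analysis is straightforward to reproduce. Here is a hypothetical sketch with made-up numbers (not The Times' data or code): if the test-positivity rate had stayed at its baseline, cases would scale in proportion to tests, so observed cases well above that expectation cannot be explained by increased testing alone.

```python
# Hypothetical weekly figures, for illustration only (not The Times' data).
baseline_tests, baseline_cases = 500_000, 25_000  # positivity = 5%
current_tests, current_cases = 900_000, 90_000    # positivity = 10%

baseline_positivity = baseline_cases / baseline_tests

# Cases we would expect if the only change were more testing,
# i.e. positivity held constant at the baseline rate.
expected_cases = baseline_positivity * current_tests

excess = current_cases - expected_cases
print(f"expected from testing alone: {expected_cases:,.0f}")        # 45,000
print(f"observed: {current_cases:,} -> excess of {excess:,.0f}")    # 90,000 -> excess of 45,000
```

The real analyses involve more careful assumptions (reporting lags, who gets tested and why), which is precisely where the statistical judgment discussed below comes in.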

The Times article was by no means an anomaly. News outlets have asserted, on the basis of in-house data analyses, that Covid-19 has killed nearly half a million more people than official records report; that Black and minority populations are overrepresented in the Covid-19 death toll; and that social distancing will usually outperform attempted quarantine. That last item, produced by The Washington Post and buoyed by in-house computer simulations, was the most read article in the history of the publication’s website, according to Washington Post media reporter Paul Farhi.

In my mind, a fine line has been crossed. Gone are the days when science journalism was like sports journalism, where the action was watched from the press box and simply conveyed. News outlets have stepped onto the field. They are doing the science themselves….(More)”.