
Stefaan Verhulst

Book Review by Charles Carman: “One day, Mrs. Pengelley came to London seeking the assistance of Hercule Poirot, Agatha Christie’s Belgian detective with the mustache, whose “little grey cells” assist him in solving mysteries. With a troubled look, she tells him that she fears she is being slowly poisoned. The doctor doesn’t see anything much the matter, she says. He attributes the stomach trouble to gastritis. She even sometimes improves, but strangely this happens during the absence of someone in her life, confirming in her a certain suspicion.

After listening to her tale with great interest, Poirot agrees to take up the case. He sends the lady back and plans to catch a train the following day to begin his investigation. Discussing the matter with his close friend, Captain Hastings, Poirot admits the case is especially interesting, even though “it has positively no new features,” because “if I mistake not, we have here a very poignant human drama.”

When Poirot arrives the next day, he discovers that the lady has been murdered after unwittingly taking the final dose of poison. Having found the case intriguing enough to look into it, Poirot chastises himself, a “criminal imbecile,” for not having taken her story more seriously. “May the good God forgive me,” he declares, “but I never believed anything would happen at all. Her story seemed to me artificial.” Had he been convinced enough to return with her right away, he might have saved her. All that remains for him now is to catch the murderer.

“The Cornish Mystery” occurred to me while reading Paul Kingsnorth’s new collection of essays, Against the Machine: On the Unmaking of Humanity. In the story he weaves, a sinister force has been lurking for some time within our civilization, especially in the West. His suspicion falls upon something to do with science, technology, and how we misapprehend the world. It has been slowly sapping away at our life, creating problems that have been diagnosed as this or that malady and treated with such and such a remedy. Sometimes we feel better. And yet, we sense we are being dehumanized, unmade, that something essential is being destroyed piece by piece. Such a process is hard to pin down. This is the genius of murder by slow poisoning: it leads to doubt and misattribution. There is little ambiguity about a gunshot to the heart. Yet when killing dose by dose, one easily mistakes murderous intent for the body’s frailty, a lingering affliction, or incidental complications: murder disguised as natural causes…(More)”.

The Cassandra of ‘The Machine’

Open Access Book edited by Ines Mergel and Carsten Schmidt: “…explores how national libraries digitally transform their processes and services by using artificial intelligence and shows how they integrate co-creation strategies and provide actionable insights and recommendations for policymakers and library managers to help shape the future of libraries. It is the result of the LibrarIN project, a Horizon Europe initiative focused on reimagining library services through social innovation and the co-creation of public value.

The book comprises three parts. The first part covers the introduction, research design, and description of expectations. The second part consists of twelve in-depth illustrations of AI projects in national libraries of the European Union, associated countries, and the Library of Congress, United States. These case studies demonstrate how national libraries co-created the digital transformation of their services by including their stakeholders in the AI implementation steps to preserve national values and heritage. The third part offers recommendations for implementation and provides insights into a “toolkit” for policymakers and innovators in libraries…(More)”.

AI Innovations in Public Services: The Case of National Libraries

Book edited by Mariavittoria Catanzariti, Francesca Incardona, Giorgio Resta, and Anders Sönnerborg: “The recent adoption of the EU Regulation on the European Health Data Space is a significant development in European data law. While the need to protect the confidentiality of information and control over personal data — and, more generally, fundamental rights, particularly those of vulnerable people — is undeniable, the importance of using data for public interest, such as in healthcare and scientific research, has been brought to the fore by the Covid-19 pandemic.

This book addresses the controversial issues surrounding data sharing, including data protection, ownership and reuse, and the related ethical considerations. With contributions from experts in various fields, including medicine and law, it encourages interdisciplinary dialogue on the use of health data in Europe and beyond…(More)”.

Data Privacy, Data Property, and Data Sharing

Primer by the European Data Protection Supervisor: “The idea behind a Digital Identity Wallet (DIW) is to provide users with an easy way to store their identity data and credentials in a digital repository. This enables them to access services in both the physical and digital worlds while ensuring accountability for transactions.

The purpose of this TechDispatch is to introduce the concept of a DIW, examine the privacy risks that arise when using one, and discuss relevant data protection by design and by default requirements and their implementation, including relevant technologies. Finally, we assess how the European Digital Identity Wallet (EUDIW), mandated by the eIDAS 2 Regulation, fits within the framework outlined.

In general, we can identify four main actors within an identity management ecosystem: the users of a DIW, identity and attribute providers (IdPs), relying parties (RPs) and the scheme authority. Depending on the governance schema, other actors can also play a role. Various digital identity models have been developed over time and are currently in use. These include the isolated model, the centralised model, the federated model and the decentralised model, depending on the architecture of the schema and on the role of the IdP. We describe these models and assess their respective pros and cons in Chapter 2.

In a user-centric identity paradigm, where credentials are stored under the user’s control, such as with a DIW, there is no need for the RP to access the IdP to verify the user’s credentials with each request. This mitigates the risk that an IdP profiles users by observing and linking their transactions with different RPs.

DIW solutions are typically implemented through a combination of mobile applications and cloud infrastructure, and can be used for identification and authorisation, as well as for issuing and using digital signatures. These solutions use digital credentials constructed and protected by cryptographic techniques, called ‘verifiable credentials’. Well-designed DIWs can provide easy access to public and private services, and enhance users’ control and privacy, convenience, interoperability and data security…(More)”.
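The key property described above can be sketched in a few lines: the issuer signs a credential once, the user keeps it in the wallet, and the relying party verifies it locally without ever contacting the identity provider. The sketch below is illustrative only, with hypothetical names; it uses a stdlib HMAC as a stand-in for the asymmetric signatures (e.g. Ed25519) that real verifiable credentials rely on, so in practice the relying party would hold only the issuer’s public key, not a shared secret.

```python
import hashlib
import hmac
import json

# Hypothetical sketch of a verifiable-credential flow. HMAC stands in for
# the asymmetric signatures real DIWs use; names are illustrative.

ISSUER_KEY = b"issuer-signing-key"  # in practice: the issuer's private key


def issue_credential(claims: dict) -> dict:
    """Issuer (IdP) signs the claims once and hands the credential to the user."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}


def verify_credential(credential: dict) -> bool:
    """Relying party checks the signature locally -- no callback to the IdP,
    so the IdP cannot observe or link the user's transactions."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])


# The wallet simply stores issued credentials under the user's control.
wallet = [issue_credential({"name": "Alice", "over_18": True})]
print(verify_credential(wallet[0]))
```

Because verification needs only the signature and the issuer’s verification key, any tampering with the stored claims makes the check fail, while the issuer never learns where the credential was presented.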

Digital Identity Wallets

OECD Paper: “Rigorous impact evaluations, particularly randomised trials, can provide governments with valuable insights into whether policies and programmes achieve their intended outcomes. Employed effectively, they offer an evidence base that goes beyond assumptions and precedent, supporting better resource allocation and more effective services for citizens. However, despite their potential, such evaluations remain underused in many contexts, given a number of technical and political barriers. This report explores how governments can overcome these barriers and deliver high-quality evaluations to contribute to policy development. It briefly discusses the potential for artificial intelligence to contribute to impact evaluation. It sets out the main evaluation methods available, ranging from randomised trials to quasi-experimental approaches, and highlights the conditions under which each can be applied. It addresses ethical issues and highlights how such concerns can be addressed through careful design and stakeholder engagement. There are also options for ensuring that evaluations can remain cost-effective, including through greater use of administrative data, alignment with policy priorities, and partnerships with wider networks. In particular, it underlines the value of international co-operation and peer learning to build capacity, share methods, and upscale effective programmes…(More)”.
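The core logic of the randomised trials the paper discusses is simple enough to sketch: randomly assign units to treatment or control, then compare mean outcomes across the two arms. The example below uses simulated data with an assumed true effect of +2; all names and numbers are illustrative, not drawn from the report.

```python
import random
import statistics

# Minimal sketch of a randomised trial, on simulated data.
random.seed(0)


def simulate_outcome(treated: bool) -> float:
    # Assumption for illustration: the programme shifts outcomes by +2 on average.
    return random.gauss(10.0, 1.0) + (2.0 if treated else 0.0)


# Randomisation: shuffle the units, then split into two equal arms.
units = list(range(200))
random.shuffle(units)
treatment, control = units[:100], units[100:]

y_t = [simulate_outcome(True) for _ in treatment]
y_c = [simulate_outcome(False) for _ in control]

# Difference in mean outcomes estimates the average treatment effect (ATE).
ate = statistics.mean(y_t) - statistics.mean(y_c)
print(round(ate, 2))
```

With random assignment, the two arms are comparable in expectation, so the difference in means is an unbiased estimate of the programme’s effect; quasi-experimental approaches try to recover the same comparison when randomisation is not possible.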

Unleashing the policy potential of rigorous impact evaluation and randomised trials

JRC Policy Brief: “…outlines the EU’s strategic imperative to assert digital sovereignty while remaining open to global collaboration. Defined as the EU’s capacity to exercise strategic independence in the digital domain—encompassing data governance, infrastructure control, and innovation—digital sovereignty aims to reduce vulnerabilities in economic, security, and technological spheres. The brief emphasizes that this does not equate to isolation or protectionism but rather to strengthening EU competencies in critical areas such as semiconductors, cloud services, and AI, while aligning with democratic values like transparency and the rule of law. A multi-layered framework is proposed, structured across four interlinked dimensions: (1) Digital Governance, focusing on regulatory frameworks and international influence; (2) Digital Infrastructures, Software, and Data, emphasizing secure connectivity, cybersecurity, and data ecosystems; (3) Digital Products and Markets, addressing industrial competitiveness and fair competition; and (4) People, highlighting the need for digital literacy and citizen empowerment. The brief underscores both opportunities (e.g., EU-led initiatives like the Digital Services Act and EuroStack) and risks, including structural dependencies on non-EU providers, fragmented national strategies, and gaps in digital skills…(More)”.

Open but Not Powerless: Towards a Common Understanding of EU Digital Sovereignty

Book by Shannon Mattern: “Computational models of urbanism—smart cities that use data-driven planning and algorithmic administration—promise to deliver new urban efficiencies and conveniences. Yet these models limit our understanding of what we can know about a city. A City Is Not a Computer reveals how cities encompass myriad forms of local and indigenous intelligences and knowledge institutions, arguing that these resources are a vital supplement and corrective to increasingly prevalent algorithmic models.

Shannon Mattern begins by examining the ethical and ontological implications of urban technologies and computational models, discussing how they shape and in many cases profoundly limit our engagement with cities. She looks at the methods and underlying assumptions of data-driven urbanism, and demonstrates how the “city-as-computer” metaphor, which undergirds much of today’s urban policy and design, reduces place-based knowledge to information processing. Mattern then imagines how we might sustain institutions and infrastructures that constitute more diverse, open, inclusive urban forms. She shows how the public library functions as a steward of urban intelligence, and describes the scales of upkeep needed to sustain a city’s many moving parts, from spinning hard drives to bridge repairs…(More)”.

A City Is Not a Computer: Other Urban Intelligences

Article by Manas Tripathi and Ashish Bhasin: “Corporate restructuring, such as mergers, acquisitions, and bankruptcy, now raises complex data-ownership challenges for regulators, especially when activities cross borders and fall under multiple legal authorities. As organizations become more digital, controlling user data has become a core issue during restructuring. Policymakers must protect citizens’ data, evaluate the value of data assets, and ensure that competition rules are followed throughout the restructuring process. Although countries have strengthened rights such as the right to know and the right to be forgotten, many firms still exploit legal gaps to access or repurpose user data during restructuring. This article examines how organizations use these loopholes to shift or expand data ownership, often bypassing regulatory protections. Using a detailed case study, we uncover the blind spots in current oversight. To address these issues, we introduce the Data Ownership Governance for Corporate Restructuring (DOGCR) framework. The framework promotes accountability and offers a structured approach for managing data ownership transitions before, during, and after corporate restructuring…(More)”.

Whose data is it, anyway? Deliberating data ownership during corporate restructuring

Book by Joshua Gans: “It is well recognized that recent advances in AI are exclusively advances in statistical techniques for prediction. While this may facilitate automation, that result is secondary to AI’s impact on decision-making. From an economics perspective, predictions have their first-order impact on the efficiency of decision-making.

In The Microeconomics of Artificial Intelligence, Joshua Gans examines AI as prediction that enhances and perhaps enables decision-making, focusing on the impacts that arise within firms or industries rather than broad economy-wide impacts on employment and productivity. He analyzes what the supply and production characteristics of AI are and what the drivers of the demand for AI prediction are. Putting these together, he explores how supply and demand conditions lead to a price for predictions and how this price is shaped by market structure. Finally, from a microeconomics perspective, he explores the key policy trade-offs for antitrust, privacy, and other regulations…(More)”.

The Microeconomics of Artificial Intelligence

Essay by Amelia Acker: “A series of exploratory case studies was conducted throughout the 1960s to research centralizing access to government data. In response, social and behavioral researchers—both within and outside the federal government—proposed what came to be known as the National Data Center. The proposal prompted several congressional hearings in the House and Senate throughout 1966. Led by Congressman Cornelius Gallagher and Senator Edward V. Long, respectively, the hearings addressed the possible invasion of privacy that would result from a data center using computer technology and automated recordkeeping to manage data gathered from the public. According to privacy scholar Priscilla Regan, “Congress’s first discussions concerning computerized record systems cast the issue in terms of the idea that individual privacy was threatened by technological change.” But, as the hearings continued and critiques in the press began to circulate, concerns shifted from the potential impacts of new computing technology on data processing to the sheer volume of information being collected about individuals—some three billion records, according to a Senate subcommittee report.

By the end of the year, the congressional inquiries exploded into a full-blown controversy, and as one observer wrote in 1967, the plan for the National Data Center “acquired the image of a design to establish a gargantuan centralized national data center calculated to bring Orwell’s 1984 at least as close as 1970.” These fears about files with personal information being aggregated into dossiers and made accessible through computers would shape data protections in the United States for decades to come…(More)”.

How “Archive” Became a Verb
