Contractual Freedom and Fairness in EU Data Sharing Agreements


Paper by Thomas Margoni and Alain M. Strowel: “This chapter analyzes the evolving landscape of EU data-sharing agreements, particularly focusing on the balance between contractual freedom and fairness in the context of non-personal data. The discussion highlights the complexities introduced by recent EU legislation, such as the Data Act, Data Governance Act, and Open Data Directive, which collectively aim to regulate data markets and enhance data sharing. The chapter emphasizes how these laws impose obligations that limit contractual freedom to ensure fairness, particularly in business-to-business (B2B) and Internet of Things (IoT) data transactions. It also explores the tension between private ordering and public governance, suggesting that the EU’s approach marks a shift from property-based models to governance-based models in data regulation. This chapter underscores the significant impact these regulations will have on data contracts and the broader EU data economy…(More)”.

AI can help humans find common ground in democratic deliberation


Paper by Michael Henry Tessler et al: “We asked whether an AI system based on large language models (LLMs) could successfully capture the underlying shared perspectives of a group of human discussants by writing a “group statement” that the discussants would collectively endorse. Inspired by Jürgen Habermas’s theory of communicative action, we designed the “Habermas Machine” to iteratively generate group statements that were based on the personal opinions and critiques from individual users, with the goal of maximizing group approval ratings. Through successive rounds of human data collection, we used supervised fine-tuning and reward modeling to progressively enhance the Habermas Machine’s ability to capture shared perspectives. To evaluate the efficacy of AI-mediated deliberation, we conducted a series of experiments with over 5000 participants from the United Kingdom. These experiments investigated the impact of AI mediation on finding common ground, how the views of discussants changed across the process, the balance between minority and majority perspectives in group statements, and potential biases present in those statements. Lastly, we used the Habermas Machine for a virtual citizens’ assembly, assessing its ability to support deliberation on controversial issues within a demographically representative sample of UK residents…(More)”.
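The mediation loop the paper describes can be sketched in a few lines of Python. The sketch below is a simplified, hypothetical illustration rather than the authors’ implementation: generate_statement and predict_approval are invented placeholder names standing in for the fine-tuned LLM and the learned reward model, and both are stubbed here so the loop runs on its own.

import random
from typing import List

def generate_statement(opinions: List[str], critiques: List[str]) -> str:
    # Stand-in for an LLM prompted with every participant's opinion plus any
    # critiques of the previous draft; returns one candidate group statement.
    return (f"Candidate v{random.randint(0, 999)} synthesising "
            f"{len(opinions)} opinions and {len(critiques)} critiques")

def predict_approval(statement: str, opinions: List[str]) -> float:
    # Stand-in for the reward model trained on participants' approval ratings;
    # returns a predicted group-endorsement score for the statement.
    return random.random()

def mediate(opinions: List[str], rounds: int = 2, n_candidates: int = 4) -> str:
    # Opinion -> statement -> critique -> revised statement: generate several
    # candidates, keep the one the reward model expects the group to endorse
    # most strongly, gather critiques, and repeat.
    critiques: List[str] = []
    best = ""
    for _ in range(rounds):
        candidates = [generate_statement(opinions, critiques) for _ in range(n_candidates)]
        best = max(candidates, key=lambda s: predict_approval(s, opinions))
        # In the experiments, critiques come from the human discussants; here
        # we only record a placeholder reaction to the selected draft.
        critiques = [f"Participant {i} critique of: {best}" for i in range(len(opinions))]
    return best

if __name__ == "__main__":
    print(mediate(["Opinion A", "Opinion B", "Opinion C"]))

In the experiments reported in the paper, each cycle of this loop was interleaved with real participant input, and the generator and reward model were improved between rounds of data collection through supervised fine-tuning and reward modelling.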

Cross-border data flows in Africa: Continental ambitions and political realities


Paper by Melody Musoni, Poorva Karkare and Chloe Teevan: “Africa must prioritise data usage and cross-border data sharing to realise the goals of the African Continental Free Trade Area and to drive innovation and AI development. Accessible and shareable data is essential for the growth and success of the digital economy, enabling innovations and economic opportunities, especially in a rapidly evolving landscape.

African countries, through the African Union (AU), have a common vision of sharing data across borders to boost economic growth. However, the adopted continental digital policies are often inconsistently applied at the national level, where some member states implement restrictive measures like data localisation that limit the free flow of data.

The paper examines how national policies, which often prioritise domestic interests, come into conflict with continental goals owing to differences in political ideologies, socio-economic conditions, security concerns and economic priorities. This misalignment between national agendas and the broader AU strategy is shaped by each country’s unique context, as the examples of Senegal, Nigeria and Mozambique show: each faces distinct challenges in implementing the continental vision.

The paper concludes with actionable recommendations for the AU, member states and the partnership with the European Union. It suggests that the AU enhance its support for data-sharing initiatives and urges member states to focus on policy alignment, address data deficiencies, build data infrastructure and find new ways to use data. It also highlights how the EU can strengthen its support for Africa’s data-sharing goals…(More)”.

A shared destiny for public sector data


Blog post by Shona Nicol: “As a data professional, it can sometimes feel hard to get others interested in data. Perhaps like many in this profession, I can often express the importance and value of data for good in an overly technical way. However, when our biggest challenges in Scotland include eradicating child poverty, growing the economy and tackling the climate emergency, I would argue that we should all take an interest in data because it’s going to be foundational in helping us solve these problems.

Data is already intrinsic to shaping our society and how services are delivered. And public sector data is a vital component in making sure that services for the people of Scotland are being delivered efficiently and effectively. Despite an ever-growing awareness of the transformative power of data to improve the design and delivery of services, feedback from public sector staff shows that they can face difficulties when trying to influence colleagues and senior leaders around the need to invest in data.

A vision gap

In the Scottish Government’s data maturity programme and more widely, we regularly hear about the challenges data professionals encounter when trying to enact change. This community tells us that a long-term vision for public sector data for Scotland could help them by providing the context for what they are trying to achieve locally.

Earlier this year we started to scope how we might do this. We recognised that organisations are already working to deliver local and national strategies and policies that relate to data, so any vision had to be able to sit alongside those, be meaningful in different settings, agnostic of technology and relevant to any public sector organisation. We wanted to offer opportunities for alignment, not enforce an instruction manual…(More)”.

Emerging technologies in the humanitarian sector


Report and project by RAND: “Emerging technologies have often been explored in the humanitarian sector through small-scale pilot projects, testing their application in a specific context with limited opportunities to replicate the testing across various contexts. The level of familiarity and knowledge of technological development varies across the specific types of humanitarian activities undertaken and technology areas considered.

The study team identified five promising technology areas for the humanitarian sector that could be further explored out to 2030:

  • Advanced manufacturing systems are likely to offer humanitarians opportunities to produce resources and tools in an operating environment characterised by scarcity, the rise of simultaneous crises, and exposure to more intense and severe climate events.
  • Early warning systems are likely to support preparedness and response efforts across the humanitarian sector as multifactorial crises arise.
  • Camp monitoring systems are likely to support not only efforts to address security risks, but also the planning and management of sites and the health and wellbeing of displaced populations.
  • Coordination platforms are likely to enhance data collection and information-sharing across various humanitarian stakeholders for the development of timely and bespoke crisis response.
  • Privacy-enhancing technologies (PETs) can support ongoing efforts to comply with increased data privacy and data protection requirements in a humanitarian operating environment in which data collection will remain necessary.

Beyond these five technology areas, the study team also considered three innovation journey opportunities:

  • The establishment of a technology horizon scanning coalition
  • Visioning for emerging technologies in crisis recovery
  • An emerging technology narrative initiative.

To accompany the deployment of specific technologies in the humanitarian sector, the study team also developed a four-step approach aimed at identifying specific guidance needs for end-users and humanitarian practitioners…(More)”.

External Researcher Access to Closed Foundation Models


Report by Esme Harrington and Dr. Mathias Vermeulen: “…addresses a pressing issue: independent researchers need better conditions for accessing and studying the AI models that big companies have developed. Foundation models — the core technology behind many AI applications — are controlled mainly by a few major players who decide who can study or use them.

What’s the problem with access?

  • Limited access: Companies like OpenAI, Google and others are the gatekeepers. They often restrict access to researchers whose work aligns with their priorities, which means independent, public-interest research can be left out in the cold.
  • High costs: Even when access is granted, it often comes with a hefty price tag that smaller or less-funded teams can’t afford.
  • Lack of transparency: These companies don’t always share how their models are updated or moderated, making it nearly impossible for researchers to replicate studies or fully understand the technology.
  • Legal risks: When researchers try to scrutinize these models, they sometimes face legal threats if their work uncovers flaws or vulnerabilities in the AI systems.

The research suggests that companies need to offer more affordable and transparent access to improve AI research. Additionally, governments should provide legal protections for researchers, especially when they are acting in the public interest by investigating potential risks…(More)”.

Tech Agnostic


Book by Greg Epstein: “…Today’s technology has overtaken religion as the chief influence on twenty-first century life and community. In Tech Agnostic, Harvard and MIT’s influential humanist chaplain Greg Epstein explores what it means to be a critical thinker with respect to this new faith. Encouraging readers to reassert their common humanity beyond the seductive sheen of “tech,” this book argues for tech agnosticism—not worship—as a way of life. Without suggesting we return to a mythical pre-tech past, Epstein shows why we must maintain a freethinking critical perspective toward innovation until it proves itself worthy of our faith or not.

Epstein asks probing questions that center humanity at the heart of engineering: Who profits from an uncritical faith in technology? How can we remedy technology’s problems while retaining its benefits? Showing how unbelief has always served humanity, Epstein revisits the historical apostates, skeptics, mystics, Cassandras, heretics, and whistleblowers who embody the tech reformation we desperately need. He argues that we must learn how to collectively demand that technology serve our pursuit of human lives that are deeply worth living…(More)”.

The Number


Article by John Lanchester: “…The other pieces published in this series have human protagonists. This one doesn’t: The main character of this piece is not a person but a number. Like all the facts and numbers cited above, it comes from the federal government. It’s a very important number, which has for a century described economic reality, shaped political debate and determined the fate of presidents: the consumer price index.

The CPI is crucial for multiple reasons, and one of them has less to do with what it is than with what it represents. The gathering of data exemplifies our ambition for a stable, coherent society. The United States is an Enlightenment project based on the supremacy of reason; on the idea that things can be empirically tested; that there are self-evident truths; that liberty, progress and constitutional government walk arm in arm and together form the recipe for the ideal state. Statistics — numbers created by the state to help it understand itself and ultimately to govern itself — are not some side effect of that project but a central part of what government is and does…(More)”.

Key lesson of this year’s Nobel Prize: The importance of unlocking data responsibly to advance science and improve people’s lives


Article by Stefaan Verhulst, Anna Colom, and Marta Poblet: “This year’s Nobel Prize for Chemistry owes a lot to available, standardised, high-quality data that can be reused to improve people’s lives. The winners, Prof David Baker from the University of Washington, and Demis Hassabis and John M. Jumper from Google DeepMind, were awarded respectively for the computational design of new proteins and for the prediction of protein structures, work that can have important medical applications. These developments build on AI models that can predict protein structures in unprecedented ways. However, key to these models and their potential to unlock health discoveries is an open curated dataset with high-quality and standardised data, something still rare despite the pace and scale of AI-driven development.

We live in a paradoxical time of both data abundance and data scarcity: a lot of data is being created and stored, but it tends to be inaccessible due to private interests and weak regulations. The challenge, then, is to prevent the misuse of data whilst avoiding its missed use.

The reuse of data remains limited in Europe, but a new set of regulations seeks to increase the possibilities of responsible data reuse. When the European Commission made the case for its European Data Strategy in 2020, it envisaged the European Union as “a role model for a society empowered by data to make better decisions — in business and the public sector,” and acknowledged the need to improve “governance structures for handling data and to increase its pools of quality data available for use and reuse”…(More)”.

It is about time! Exploring the clashing timeframes of politics and public policy experiments


Paper by Ringa Raudla, Külli Sarapuu, Johanna Vallistu, and Nastassia Harbuzova: “Although existing studies on experimental policymaking have acknowledged the importance of the political setting in which policy experiments take place, we lack systematic knowledge on how various political dimensions affect experimental policymaking. In this article, we address a specific gap in the existing understanding of the politics of experimentation: how political timeframes influence experimental policymaking. Drawing on theoretical discussions on experimental policymaking, public policy, electoral politics, and mediatization of politics, we outline expectations about how electoral and problem cycles may influence the timing, design, and learning from policy experiments. We argue that electoral timeframes are likely to discourage politicians from undertaking large-scale policy experiments and that, if politicians do decide to launch experiments, they prefer shorter designs. The electoral cycle may lead politicians to draw overly hasty conclusions or ignore the experiment’s results altogether. We expect problem cycles to shorten politicians’ time horizons further, as there is pressure to solve problems quickly. We probe the plausibility of our theoretical expectations using interview data from two different country contexts: Estonia and Finland…(More)”.