
Stefaan Verhulst

Article by David Elliott: “How do people today stay informed about what’s happening in the world? In most countries, TV, print and websites are becoming less popular, according to a report from the Reuters Institute.

The 2025 Digital News Report, which distills data from six continents and 48 markets, finds that these traditional news media sources are struggling to connect with the public, with declining engagement, low trust and stagnating subscriptions.

So where are people getting their news in 2025? And what might be the impact of these shifts?

Amid political and economic uncertainty, the climate crisis and ongoing conflicts around the world, there is certainly no lack of stories to report on. But audiences are continuing to go to new places to find them – namely, social media, video platforms and online aggregators.

Social media use for news is rising across many countries, although this is more pronounced in the United States, Latin America, Africa and some Southeast Asian countries. In the US, for example, the proportion of people who say social media is their main source of news has risen significantly in the past decade, from around 4% in 2015 to 34% in 2025. The proportion of people in the US accessing news via social media and video networks overtook both TV news and news websites for the first time.

Graphs showing the proportion that say social media is their main source of news.

In many European countries, traditional news sources have been more resilient, but social media use for news is still rising. In the UK and France, for example, about a fifth of people in each country now use social media as their primary news source, compared to well below 10% a decade ago.

Across all of the markets studied by the report, the proportion consuming video continues to grow. And dependence on social media and video networks for news is highest with younger groups – 44% of 18- to 24-year-olds and 38% of 25- to 34-year-olds say these are their main sources of news…(More)”.

This is how people in 2025 are getting their news

Book Review by Gordon LaForge of “The Technological Republic: Hard Power, Soft Belief, and the Future of the West By Alexander C. Karp and Nicholas W. Zamiska”: “…Karp laments that the government has stepped away from technology development, “a remarkable and near-total placement of faith in the market.” In his view, Silicon Valley, which owes its existence to federal investment and worked hand-in-glove with the state to produce the breakthroughs of the post-Sputnik era, has “lost its way.” Instead, founders who claim to want to change the world have created food-delivery apps, photo-sharing platforms, and other trivial consumer products.

Less resonant is Karp’s diagnosis of the source of the problem. In his view, America’s tech leaders have become soft and timid. They fear doing anything that might invite controversy or disapproval, like taking on a military contract or supporting a national mission. They are of a generation that has abandoned “belief or conviction in broader political projects,” he writes, trained to simply mimic what has come before and conform to prevailing sentiment.

This all has its roots, Karp argues, in a “systematic attack and attempt to dismantle any conception of American or Western identity during the 1960s and 1970s.”…

Karp’s claims feel divorced from reality. Debates about justice and national identity run riot in America today. A glance at Elon Musk’s X feed or Meta’s content moderation policies dispels the idea that controversy avoidance is the tech industry’s North Star. Internal contradictions in Karp’s argument abound. For instance, in one part of the book he criticizes tech leaders for sheep-like conformity, while in another he lionizes the “unwillingness to conform” as the quintessence of Silicon Valley. It doesn’t help that Karp makes his case not so much with evidence but with repetition of his claims and biographical snippets of historical figures.

Karp’s preoccupation with what he calls “soft belief” misses the deeper structural reality. Innovation is not merely a function of the mindset of individual founders; it depends on an ecosystem of public and private institutions—tax policy, regulations, the financial system, education, labor markets, and so on. In the United States, the public aspects of that ecosystem have weakened over time, while the private sector and its attendant interests have flourished….Karp’s treatise seems to spring from a belief that he expressed in a February earnings call: “Whatever is good for America will be good for Americans and very good for Palantir.” This conflation of the gains of private companies with the good of the country explains much of what’s gone wrong in the United States today—whether in technological innovation or elsewhere…(More)”.

Where Are the Moonshots?

Report comparing EU, US and Chinese approaches by the European Commission: “This policy brief focusses on technology monitoring and assessment (TMA). TMA is important for R&I policy orientation, and has significant socioeconomic impacts, especially in terms of emerging technologies. After outlining the advantages and features of TMA systems, this brief compares the TMA systems in China, the US and the EU, and outlines the structural and methodological challenges faced by the EU TMA approach. This brief concludes by providing recommendations to EU policymakers to transform the EU TMA system into a distinct advantage in the competition for global innovation leadership…(More)”.

Technology monitoring and assessment

Paper by Ioannis Lianos: “The EU legal framework for data access and portability has undergone significant evolution, particularly in the realm of health data, with recent initiatives like the European Health Data Space (EHDS) and competition law enforcement expanding data-sharing obligations across various economic actors. This evolution reflects a shift from an initial emphasis on individuals’ fundamental rights to access and port their health data—rooted in privacy protection, personal data rights, and digital sovereignty—towards a more utilitarian perspective. This newer approach extends data-sharing obligations to cover co-generated data involving end-users, business users, and complementors within digital health ecosystems, promoting a concept of data co-use or co-ownership rather than private ownership. Furthermore, the regulatory framework has proactively established ‘data commons’ to foster cumulative innovation and broader industry transformation. The increasing prominence of a fairness rhetoric in EU regulatory and competition law underscores a transformational intent, aiming not only to acknowledge stakeholders’ contributions to data generation but also to ensure equal economic opportunities within the digital health space and facilitate the EU’s digital transition. This study adopts a law and political economy perspective to examine the competition-related bottleneck issues specific to health data, considering the economic structure of its generation, capture, and exploitation. It then analyses the distributive implications of current regulations (including the DMA, Data Act, EHDS, Digital Governance Act, and Competition Law) by exploring relationships between key economic players: digital platforms and end users, platforms and their ecosystem complementors, and external third-party businesses interacting with the digital health ecosystem…(More)”.

Access to Health Data, Competition, and Regulatory Alternatives: Three Dimensions of Fairness 

Conference Proceedings edited by Josef Drexl, Moritz Hennemann, Patricia Boshe, and Klaus Wiedemann: “The increasing relevance of data is now recognized all over the world. The large number of regulatory acts and proposals in the field of data law serves as a testament to the significance of data processing for the economies of the world. The European Union’s Data Strategy, the African Union’s Data Policy Framework and the Australian Data Strategy only serve as examples within a plethora of regulatory actions. Yet, the purposeful and sensible use of data does not only play a role in economic terms, e.g. regarding the welfare or competitiveness of economies. The implications for society and the common good are at least equally relevant. For instance, data processing is an integral part of modern research methodology and can thus help to address the problems the world is facing today, such as climate change.

The conference was the third and final event of the Global Data Law Conference Series. Legal scholars from all over the world met, presented and exchanged their experiences on different data-related regulatory approaches. Various instruments and approaches to the regulation of data – personal or non-personal – were discussed, without losing sight of the global effects going hand-in-hand with different kinds of regulation.

In compiling the conference proceedings, this book does not only aim at providing a critical and analytical assessment of the status quo of data law in different countries today, it also aims at providing a forward-looking perspective on the pressing issues of our time, such as: How to promote sensible data sharing and purposeful data governance? Under which circumstances, if ever, do data localisation requirements make sense? How – and by whom – should international regulation be put in place? The proceedings engage in a discussion on future-oriented ideas and actions, thereby promoting a constructive and sensible approach to data law around the world…(More)”.

Comparative Data Law

Toolkit by the Broadband Commission Working Group on Data Governance: “…the Toolkit serves as a practical, capacity-building resource for policymakers, regulators, and governments. It offers actionable guidance on key data governance priorities — including legal frameworks, institutional roles, cross-border data flows, digital self-determination, and data for AI.

As a key capacity building resource, the Toolkit aims to empower policymakers, regulators and data practitioners to navigate the complexities of data governance in the digital era. Plans are currently underway to translate the Toolkit into French, Spanish, Chinese, and Arabic to ensure broader global accessibility and impact. Pilot implementation at country level is also being explored for Q4 2025 to support national-level uptake.   

The Data Governance Toolkit

The Data Governance Toolkit: Navigating Data in the Digital Age offers a practical, rights-based guide to help governments, institutions, and stakeholders make data work for all.  

The Toolkit is organized around four foundational data governance components—referred to as the 4Ps of Data Governance: 

  • Why (Purpose): How to define a vision and purpose for data governance in the context of AI, digital transformation, and sustainable development. 
  • How (Principles): What principles should guide a governance framework to balance innovation, security, and ethical considerations. 
  • Who (People and Processes): Identifying the stakeholders, institutions, and processes required to build and enforce responsible governance structures. 
  • What (Practices and Mechanisms): Policies and best practices to manage data across its entire lifecycle while ensuring privacy, interoperability, and regulatory compliance.

Data governance framework

The Toolkit also includes: 

  • A self-assessment framework to help organizations evaluate their current capabilities; 
  • A glossary of key terms to foster shared understanding;  
  • A curated list of other toolkits and frameworks for deeper engagement. 

Designed to be adaptable across regions and sectors, the Data Governance Toolkit is not a one-size-fits-all manual—but a modular resource to guide smarter, safer, and fairer data use in the digital age…(More)”

Data Governance Toolkit: Navigating Data in the Digital Age

Report by Ofcom (UK): “…We outline three potential policy options and models for facilitating greater researcher access, which include:

  1. Clarify existing legal rules: Relevant authorities could provide additional guidance on what is already legally permitted for researcher access on important issues, such as data donations and research-related scraping.
  2. Create new duties, enforced by a backstop regulator: Services could be required to put in place systems and processes to operationalise data access. This could include new duties on regulated services to create standard procedures for researcher accreditation. Services would be responsible for providing researchers with data directly or providing the interface through which they can access it and offering appeal and redress mechanisms. A backstop regulator could enforce these duties – either an existing or new body. 
  3. Enable and manage access via independent intermediary: New legal powers could be granted to a trusted third party which would facilitate and manage researchers’ access to data. This intermediary – which could again be an existing or new body – would accredit researchers and provide secure access.

Our report describes three types of intermediary that could be considered – direct access intermediary, notice to service intermediary and repository intermediary models.

  • Direct access intermediary. Researchers could request data with an intermediary facilitating secure access. In this model, services could retain responsibility for hosting and providing data while the intermediary maintains the interface by which researchers request access.
  • Notice to service intermediary. Researchers could apply for accreditation and request access to specific datasets via the intermediary. This could include data that would not be accessible in direct access models. The intermediary would review and refuse or approve access. Services would then be required to provide access to the approved data.
  • Repository intermediary. The intermediary could itself provide direct access to data, by providing an interface for data access and/or hosting the data itself and taking responsibility for data governance. This could also include data that would not be accessible in direct access models…(More)”.

Researchers’ access to information from regulated online services

Article by Kelly Ommundsen: “We are living through one of the most transformative moments in human history. Technologies like artificial intelligence (AI), quantum computing and synthetic biology are accelerating change at a pace few institutions are prepared to manage. Yet while innovation is leaping forward, regulation often remains standing still – constrained by outdated models, fragmented approaches and a reactive mindset…

To address this growing challenge, the World Economic Forum, in collaboration with the UAE’s General Secretariat of the Cabinet, has launched the Global Regulatory Innovation Platform (GRIP).

GRIP is a new initiative designed to foster human-centred, forward-looking and globally coordinated approaches to regulation. Its goal: to build trust, reduce uncertainty and accelerate innovation that serves the public good.

This platform builds on the World Economic Forum’s broader body of work on agile governance. As outlined in the Forum’s 2020 report, Agile Governance: Reimagining Policy-making in the Fourth Industrial Revolution, traditional regulatory approaches – characterized by top-down control and infrequent updates – are increasingly unfit for the pace, scale and complexity of modern technological change…(More)”.

How a new platform is future-proofing governance for the intelligent age

Book edited by Cecilia Biancalana and Eric Montigny: “Democracy and data have a complicated relationship. Under the influence of big data and artificial intelligence, some democracies are being transformed as relations between citizens, political parties, governments, and corporations are gradually redrawn.

Artificial Democracy explores the ways in which data collection and analytics and their application are changing political practices, government policies, and even democratic policies themselves. With an international roster of multidisciplinary contributors, this topical collection takes a comprehensive approach to big data’s effect on democracy, from the use of micro-targeting in electoral campaigns to the clash between privacy and surveillance in the name of protecting society.

The book tackles both the dangers and the potentially desirable changes made possible by the symbiosis of big data and artificial intelligence. It explores shifts in how we conceptualize the citizen-government relationship and asks important questions about where we could be heading…(More)”.

Artificial Democracy: The Impact of Big Data on Politics, Policy, and Polity

Article by Eric Holthaus: “A critical US atmospheric data collection program will be halted by Monday, giving weather forecasters just days to prepare, according to a public notice sent this week. Scientists that the Guardian spoke with say the change could set hurricane forecasting back “decades”, just as this year’s season ramps up.

In a National Oceanic and Atmospheric Administration (Noaa) message sent on Wednesday to its scientists, the agency said that “due to recent service changes” the Defense Meteorological Satellite Program (DMSP) will “discontinue ingest, processing and distribution of all DMSP data no later than June 30, 2025”.

Due to their unique characteristics and ability to map the entire world twice a day with extremely high resolution, the three DMSP satellites are a primary source of information for scientists to monitor Arctic sea ice and hurricane development. The DMSP partners with Noaa to make weather data collected from the satellites publicly available.

The reasons for the changes, and which agency was driving them, were not immediately clear. Noaa said they would not affect the quality of forecasting.

However, the Guardian spoke with several scientists inside and outside of the US government whose work depends on the DMSP, and all said there are no other US programs that can form an adequate replacement for its data.

“We’re a bit blind now,” said Allison Wing, a hurricane researcher at Florida State University. Wing said the DMSP satellites are the only ones that let scientists see inside the clouds of developing hurricanes, giving them a critical edge in forecasting that now may be jeopardized.

“Before these types of satellites were present, there would often be situations where you’d wake up in the morning and have a big surprise about what the hurricane looked like,” said Wing. “Given increases in hurricane intensity and increasing prevalence towards rapid intensification in recent years, it’s not a good time to have less information.”…(More)”.

Sudden loss of key US satellite data could send hurricane forecasting back ‘decades’
