Article by Isobel Moure, Tim O’Reilly and Ilan Strauss: “Can we head off AI monopolies before they harden? As AI models become commoditized, incumbent Big Tech platforms are racing to rebuild their moats at the application layer, around context: the sticky user- and project-level data that makes AI applications genuinely useful. With the right context-aware AI applications, each additional user-chatbot conversation, file upload, or coding interaction improves results; better results attract more users; and more users mean more data. This context flywheel — a rich, structured user- and project-data layer — can drive up switching costs, creating a lock-in effect that effectively traps accumulated data within the platform.
Protocols prevent lock-in. We argue that open protocols — exemplified by Anthropic’s Model Context Protocol (MCP) — serve as a powerful rulebook, helping to keep API-exposed context fluid and to prevent Big Tech from using data lock-in to extend their monopoly power. However, as an API wrapper, MCP can access only what a particular service (such as GitHub or Slack) happens to expose through its API.
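To make the "API wrapper" point concrete, below is a minimal sketch of what an MCP tool fronting GitHub might look like, assuming the MCP Python SDK's FastMCP helper and GitHub's public issues endpoint; the tool name, fields, and error handling are illustrative, not the authors' implementation.

```python
# Minimal sketch, assuming the MCP Python SDK's FastMCP helper and GitHub's
# public REST API; names, fields, and auth handling are illustrative.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("github-context")

@mcp.tool()
def list_open_issues(repo: str) -> list[dict]:
    """Return open issues for a repository given as 'owner/name'."""
    resp = httpx.get(f"https://api.github.com/repos/{repo}/issues",
                     params={"state": "open"}, timeout=10)
    resp.raise_for_status()
    # The tool can surface only the context the upstream API already exposes.
    return [{"title": i["title"], "url": i["html_url"]} for i in resp.json()]

if __name__ == "__main__":
    mcp.run()  # serve the tool over stdio to any MCP-compatible client
```

However rich a user's GitHub context may be, a connector like this can pass along only the fields the platform chooses to expose, which is why the authors pair protocols with guaranteed API access.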
To fully enable open, healthy, and competitive AI markets, we need complementary measures that ensure protocols can access the full spectrum of user context, including through:
1. Guaranteed access for authorized developers to user-owned data through open APIs at major platforms.
2. Portable memory that separates a user’s agentic memory from specific applications.
3. Guardrails governing how AI services can leverage user data.
Drawing on the example of open-banking regulations, we show that security and data standards are required for any of these proposals to be realized.
Architecting an open, interoperable AI stack through the protocol layer is about supporting broad value creation rather than value capture by a few firms. Policy efforts such as the EU’s General-Purpose AI Code of Practice do matter; but, ultimately, it is software architecture that most immediately and decisively shapes market outcomes. Protocols — the shared standards that let different systems communicate with one another — function as a deeper de facto law, enabling independent, decentralized, and secure action in digital markets…(More)”.
Article by Michelle Nichols: “A United Nations report seeking ways to improve efficiency and cut costs has revealed: U.N. reports are not widely read.
U.N. Secretary-General Antonio Guterres briefed countries on Friday on the report, produced by his UN80 reform initiative, which focused on how U.N. staff implement thousands of mandates given to them by bodies like the General Assembly or Security Council. He said that last year the U.N. system supported 27,000 meetings involving 240 bodies, and the U.N. secretariat produced 1,100 reports, a 20% increase since 1990. “The sheer number of meetings and reports is pushing the system – and all of us – to the breaking point,” Guterres said.
“Many of these reports are not widely read,” he said. “The top 5% of reports are downloaded over 5,500 times, while one in five reports receives fewer than 1,000 downloads. And downloading doesn’t necessarily mean reading.”…(More)”.
Paper by Charles Reuven Starobin Hatfield et al: “The study set out with several key objectives:
1. To analyze traffic congestion during school terms versus holidays—responding to anecdotal evidence suggesting school-term congestion is a problem despite the absence of formal analysis
2. To assess the impact of school commutes on overall citywide congestion, and
3. To explore the broader equity and economic implications of this congestion.
The analysis utilized the Uber Movement dataset from 2019, which covers 98% of Nairobi’s motorways, primary, secondary, and trunk roads; 88.7% of tertiary roads; and 9.5% of residential roads. The study focused on three school terms and three corresponding holiday periods, intentionally excluding public holidays and weekends to isolate the school-related traffic impact. The primary temporal focus was the morning rush hour, defined as 6 a.m. to 9 a.m.
The approach relied heavily on Uber Movement data for both exploratory and in-depth analysis of congestion during morning hours. The analytical steps included hourly and daily traffic analysis, binomial analysis of the most and least congested roads, travel time loss modeling, statistical evaluation, and interpretation supported by local knowledge. The results from the exploratory hourly analysis showed significant morning rush hour congestion during school terms, with sharp speed declines in the early morning hours, pointing to capacity challenges in the road network.
Daily traffic pattern analysis revealed distinct seasonal variations, varied congestion patterns during school terms, and elevated travel speeds on weekends and holidays. The binomial analysis highlighted an unequal distribution of congestion across Nairobi, with structural overburdening observed on arterial roads, while motorways and primary roads appeared less affected. Statistical testing confirmed that differences between school term and holiday periods were statistically significant—even after controlling for spatial and temporal autocorrelation…(More)”.
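As a rough illustration of the kind of per-road comparison described above, the sketch below tests whether morning speeds on each road are lower during school terms than during holidays. The file name, column names, and the use of Welch's t-test are assumptions for illustration; the paper's binomial analysis, travel time loss model, and autocorrelation controls are not reproduced here.

```python
# Illustrative sketch only: hypothetical file and column names; Welch's t-test
# stands in for the paper's fuller binomial and autocorrelation-aware analysis.
import pandas as pd
from scipy.stats import ttest_ind

speeds = pd.read_csv("uber_movement_hourly_speeds.csv")          # hypothetical
morning = speeds[(speeds["hour"] >= 6) & (speeds["hour"] < 9)]   # 6-9 a.m. window
daily = (morning.groupby(["road_id", "date", "period"])["speed_kph"]
                .mean()
                .reset_index())

def term_vs_holiday_pvalue(df: pd.DataFrame) -> float:
    """Welch's t-test: are mean morning speeds lower during school terms?"""
    term = df.loc[df["period"] == "term", "speed_kph"]
    holiday = df.loc[df["period"] == "holiday", "speed_kph"]
    return ttest_ind(term, holiday, equal_var=False, alternative="less").pvalue

pvalues = daily.groupby("road_id").apply(term_vs_holiday_pvalue)
print(f"{(pvalues < 0.05).mean():.0%} of roads are significantly slower in term time")
```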
Report by the American Statistical Association: “Since January 2025, Executive Branch initiatives have resulted in contracts for statistical services being cut or frozen and in significant decreases in statistical staff through layoffs and aggressive incentives for employees to leave federal service. Loss of staff means the loss of priceless institutional knowledge and hinders statistical agencies’ ability to fulfill their legal obligations. Some agencies—such as the National Center for Education Statistics (NCES), reduced to only three staff in March—have been hit especially hard, while others have suffered varying degrees of damage from broader budget cuts and hiring freezes. As a result, agencies have cut data products and programs and are weakened in their ability to collect and maintain reliable data going forward. According to Politico, a possible instance of improper political influence occurred when the U.S. Department of Agriculture (USDA) held up the release of a report from the Economic Research Service (ERS) over concerns that it forecast an increase in the agricultural trade deficit. Additionally, it is not clear what the future implications are of replacing the career civil-service Chief Statistician of the United States with a political appointee on July 11. Many have advocated for higher visibility of the chief statistician position within the U.S. Office of Management and Budget (OMB), but this change was made without any consultation with the user community, which has raised questions…(More)”.
Article by Guru Madhavan: “For centuries, when people wanted to describe a technology, they spoke of “inventions” or “the useful arts”. In early English usage, “technology” referred to a treatise on technical subjects, not the tools themselves. Its modern usage — which covers everything from toothpicks to Teslas, and TikTok to tomahawks — gained ground in the late 19th and early 20th centuries, as engineering aligned itself with scientific authority and institutional prestige.
The result is that “technology” has become a bloated umbrella, spanning too much and clarifying too little. Nor is it alone in this semantic stampede. Words like “innovation,” “smart” and “sustainability” have suffered similar dilution, sprayed across policy memos and pitch decks until their edges blur.
Vagueness offers alluring flexibility. Consultants peddle “technology solutions” to undefined problems. More troubling, “tech company” has become a convenient shield. Social platforms claim they are not publishers. Ride-hailing services say they are not employers. Online marketplaces avoid retail classification. Without distinction, accountability drifts.
Precision still works, but only when we allow it. “Biotech” emerged from the haze once investors needed a way to separate pills from pixels. “Fintech” spans wire transfers and crypto speculation. “Edtech” includes tutoring apps and loan servicers. “Agtech” groups robotic milkers with gene-edited crops. “Cleantech” wraps battery storage, algae farms and “clean coal” under the same feel-good brand. These tags reproduce confusion at smaller scales.
Language, itself a technology, shapes how we understand agency. “Technology is changing the workplace” conceals the fact that it is executives who are choosing to automate processes and cut jobs. “Technology connects us” hides the deliberate design of attention-harvesting systems. This framing presents human decisions as inevitable shifts.
At its most insidious, “technology” flattens innovation itself. Even “science and technology” — a favourite handle in policy circles — makes engineers wince. It confuses product with process and presents tools as if they have emerged from theory alone. When every advancement, from a hair dryer to the Hoover Dam, shares one designation, differences in form, intent and consequence fade.
The Greeks had it right. Technē meant skilled craft, guided by prudence and purpose. It described something made, named, practised and held to account by people. From technē, we derived “technology.” Over time, however, we traded clarity for cachet…(More)”.
Article by Mary Harrington: “When I was a kid in the 1980s, my parents sent me to a Waldorf school in England. At the time, the school discouraged parents from allowing their kids to watch too much TV, instead telling them to emphasize reading, hands-on learning and outdoor play.
I chafed at the stricture then. But perhaps they were on to something: Today I don’t watch much TV and I still read a lot. Since my school days, however, a far more insidious and enticing form of tech has taken hold: the internet, especially via smartphones. These days I know I have to put my phone in a drawer or in another room if I need to concentrate for more than a few minutes.
From the time so-called intelligence tests were invented around a century ago until recently, international I.Q. scores climbed steadily in a phenomenon known as the Flynn effect. But there is evidence that our ability to apply that brain power is decreasing. According to a recent report, adult literacy scores leveled off and began to decline across a majority of O.E.C.D. countries in the past decade, with some of the sharpest declines visible among the poorest. Kids also show declining literacy.
Writing in The Financial Times, John Burn-Murdoch links this to the rise of a post-literate culture in which we consume most of our media through smartphones, eschewing dense text in favor of images and short-form video. Other research has associated smartphone use with A.D.H.D. symptoms in adolescents, and a quarter of surveyed American adults now suspect they may have the condition. School and college teachers assign fewer full books to their students, in part because students are unable to complete them. Nearly half of Americans read zero books in 2023.
The idea that technology is altering our capacity not just to concentrate but also to read and to reason is catching on. The conversation no one is ready for, though, is how this may be creating yet another form of inequality…(More)”.
Article by Maha Hosain Aziz: “Traditional foreign aid is losing steam. Budget constraints, donor fatigue and nationalist politics have eroded the once-dominant western development model. But as governments pull back, a new actor has stepped in. Artificial intelligence is being deployed with a speed and reach that traditional organisations struggle to match. Code — not cash — is the new foreign aid.
Across the global south, AI is already doing some of the work that aid agencies once dominated. Ubenwa’s neonatal diagnostic app in Nigeria, Somanasi’s AI tutor in Kenya and Hello Tractor’s AI-enabled fleet management for small farmers are delivering essential services where public institutions are overstretched or absent.
Who is delivering this AI-powered development? It’s not the World Bank or USAID. Instead, tech companies like OpenAI, Google, Microsoft and Nvidia, alongside local civic-tech innovators, are stepping forward.
Consider what has already been rolled out. In the past year OpenAI has partnered with a primary care provider in Kenya to support local AI development in healthcare. In South Africa, billionaire Strive Masiyiwa worked with Nvidia to launch the continent’s first “AI factory” — a Johannesburg-based hub designed to train local talent and build regionally relevant models. In Kenya and Ghana, Google is investing in AI research centres. These projects are not labelled as foreign aid, but they’re delivering infrastructure, skills, and tools in exactly the areas where traditional donors have pulled back.
This work isn’t altruism; it’s strategy. The Trump administration’s recently released AI Action Plan makes the point explicit: AI is now a core pillar of foreign policy. The plan outlines a bold objective — exporting “the full AI stack” (from chips to models to standards) to build alliances, spread American values and counter Chinese influence in emerging markets.
But those values are not always clear — or universally shared. Alongside the push to expand access to “responsible AI,” US policymakers are backing efforts to remove what some see as “woke” elements from AI models — curbing progressive language on race, gender and history…(More)”.
Book by Kieron O’Hara: “Likening contemporary extremes of far-right populism and identity politics to 17th-century Peasants and Puritans, Blockchain Politics examines the enduring importance of trust in political life. Kieron O’Hara develops a new theory of trust to analyse how these extremes undermine social accord and weaken representative democracy, and to suggest remedies.
Outlining a novel and insightful theory of trust as the basis of community relations and political institutions, the book describes in detail how the shift towards individualism in liberal democracies frames trust as a vulnerability, taking inspiration from new technologies such as blockchain and smart contracts to implement ‘trustless trust’. O’Hara demonstrates that, on the contrary, conservative measures are needed to preserve and protect liberal societies from the excesses of modern liberalism, progressivism and identity politics. He illustrates the importance of trust in responding effectively to climate change, geopolitical uncertainty and ageing populations, and argues that the solution to such serious political issues lies in rival parties accepting the positive characteristics of modern democracies, and committing to sustaining them…(More)”.
Paper by Xin Lu et al: “Human mobility forms the backbone of contact patterns through which infectious diseases propagate, fundamentally shaping the spatio-temporal dynamics of epidemics and pandemics. Traditional models are often based on the assumption that all individuals have the same probability of infecting every other individual in the population (so-called random homogeneous mixing), and they struggle to capture the complex and heterogeneous nature of real-world human interactions. Recent advancements in data-driven methodologies and computational capabilities have unlocked the potential of integrating high-resolution human mobility data into epidemic modeling, significantly improving the accuracy, timeliness, and applicability of epidemic risk assessment, contact tracing, and intervention strategies. This review provides a comprehensive synthesis of the current landscape in human mobility-informed epidemic modeling. We explore diverse sources and representations of human mobility data, and then examine the behavioral and structural roles of mobility and contact in shaping disease transmission dynamics. Furthermore, the review spans a wide range of epidemic modeling approaches, from classical compartmental models to network-based, agent-based, and machine learning models. We also discuss how mobility integration enhances risk management and response strategies during epidemics. By synthesizing these insights, the review can serve as a foundational resource for researchers and practitioners, bridging the gap between epidemiological theory and the dynamic complexities of human interaction while charting clear directions for future research…(More)”.
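For readers unfamiliar with the contrast the review draws, the sketch below shows a two-patch metapopulation SIR model in which a mobility (coupling) matrix replaces the homogeneous-mixing assumption; the matrix, populations, and rates are illustrative values, not taken from the paper.

```python
# Illustrative two-patch metapopulation SIR: a mobility matrix M replaces
# homogeneous mixing; all parameters are assumed values, not fitted estimates.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.3, 0.1                      # transmission and recovery rates
M = np.array([[0.9, 0.1],                   # row i: where residents of patch i
              [0.2, 0.8]])                  # make their contacts
N = np.array([500_000.0, 500_000.0])        # patch populations

def metapop_sir(t, y):
    S, I, R = y.reshape(3, -1)
    lam = beta * (M @ (I / N))              # mobility-weighted force of infection
    dS = -lam * S
    dI = lam * S - gamma * I
    dR = gamma * I
    return np.concatenate([dS, dI, dR])

y0 = np.concatenate([N - [10.0, 0.0], [10.0, 0.0], [0.0, 0.0]])  # seed patch 0
sol = solve_ivp(metapop_sir, (0, 180), y0, max_step=1.0)
print("peak infections per patch:", sol.y[2:4].max(axis=1).astype(int))
```

Setting every row of M to the patches' population shares recovers the classical homogeneous-mixing baseline, which is the simplification the review argues mobility data now allows models to move beyond.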
Article by Stefaan Verhulst: “Europe is facing a defining moment in its approach to science, research and innovation. As geopolitical tensions mount and investment in dual-use technologies surges, the EU is being called to reimagine its research policy – not just for strategic autonomy but for lasting societal relevance and real global impact.
At a recent CEPS dialogue on ‘Reimagining EU Research and Innovation Policy,’ this author focused on five asymmetries that policymakers absolutely must address if Europe is to avoid a future of diminished influence, declining trust and squandered opportunity. And the best way to avoid such a future is to build a truly (open) Science Stack.
Data asymmetry – or ‘Winter is Coming’
The foundation of modern research – especially in the AI age – is data. Yet access to high-quality, dynamic datasets remains highly concentrated among a few private actors.
Despite years of rhetorical support for data sharing, there has been little progress in fostering systematic, sustainable and responsible data reuse. Without real incentives for data collaboration and investment in data stewardship, Europe risks entering a ‘data winter,’ where researchers and innovators are unable to access the very resources needed to compete or contribute meaningfully.
A Data Commons approach – governed by clear purpose, ethical principles and structured collaboration mechanisms – isn’t a luxury. It’s an existential necessity…(More)”