These Words Are Disappearing in the New Trump Administration


Article by Karen Yourish et al: “As President Trump seeks to purge the federal government of ‘woke’ initiatives, agencies have flagged hundreds of words to limit or avoid, according to a compilation of government documents.

The flagged terms appeared in government memos, in official and unofficial agency guidance and in other documents viewed by The New York Times. Some documents ordered the removal of these words from public-facing websites, or ordered the elimination of other materials (including school curricula) in which they might appear.

In other cases, federal agency managers advised caution in using the terms without instituting an outright ban. Additionally, the presence of some terms was used to automatically flag for review grant proposals and contracts that could conflict with Mr. Trump’s executive orders.

The list is most likely incomplete. More agency memos may exist than those seen by New York Times reporters, and some directives are vague or suggest what language might be impermissible without flatly stating it.

All presidential administrations change the language used in official communications to reflect their own policies. That is within their prerogative, as is amending or removing web pages, which The Times has found has already happened thousands of times in this administration…(More)”

How to Win a War Against Reality


Review by Abby Smith Rumsey: “How does a democracy work if its citizens do not have a shared sense of reality? Not very well. A country whose people cannot agree on where they stand now will not agree on where they are going. This is where Americans find themselves in 2025, and they did not arrive at this juncture yesterday. The deep divisions that exist have grown over the decades, dating at least to the end of the Cold War in 1991, and are now metastasizing at an alarming rate. These divisions have many causes, from climate change and COVID-19 to unchecked migration and growing wealth inequality, among other factors. People who live with chronic division and uncertainty are vulnerable. It may not take much to get them to sign on to a politics of certainty…

Take the United States. By this fractured logic, Make America Great Again (MAGA) means that America once was great, is no longer, but can be restored to its prelapsarian state, when whites sat firmly at the top of the ethnic hierarchy that constitutes the United States. Jason Stanley, a professor of philosophy and self-identified liberal, is deeply troubled that many liberal democracies across the globe are morphing into illiberal democracies before our very eyes. In “Erasing History: How Fascists Rewrite the Past to Control the Future,” he argues that all authoritarian regimes know the value of a unified, if largely mythologized, view of past, present, and future. He wrote his book to warn us that we in the United States are on the cusp of becoming an authoritarian nation or, in Stanley’s account, fascist. By explaining “the mechanisms by which democracy is attacked, the ways myths and lies are used to justify actions such as wars, and scapegoating of groups, we can defend against these attacks, and even reverse the tide.”…

The fabrication of the past is also the subject of Steve Benen’s book “Ministry of Truth: Democracy, Reality, and the Republicans’ War on the Recent Past.” Benen, a producer on the Rachel Maddow Show, keeps his eye tightly focused on the past decade, still fresh in the minds of readers. His account tracks closely how the Republican Party conducted “a war on the recent past.” He attempts an anatomy of a very unsettling phenomenon: the success of a gaslighting campaign that Trump and his supporters perpetrated against the American public and even against fellow Republicans who are not MAGA enough for Trump…(More)”

Vetted Researcher Data Access


Coimisiún na Meán: “Article 40 of the Digital Services Act (DSA) makes provision for researchers to access data from Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs) for the purposes of studying systemic risk in the EU and assessing mitigation measures. There are two ways that researchers studying systemic risk in the EU can get access to data under Article 40 of the DSA.

The first is access to non-public data, known as “vetted researcher data access”, under Article 40(4)-(11). This is a process where a researcher, who has been vetted or assessed by a Digital Services Coordinator as meeting the criteria set out in DSA Article 40(8), can request access to non-public data held by a VLOP/VLOSE. The data must be limited in scope and deemed necessary and proportionate to the purpose of the research.

The second is access to public data under Article 40(12). This is a process where a researcher who meets the relevant criteria can apply for data access directly from a VLOP/VLOSE, for example, access to a content library or API of public posts…(More)”.

A US-run system alerts the world to famines. It’s gone dark after Trump slashed foreign aid


Article by Lauren Kent: “A vital, US-run monitoring system focused on spotting food crises before they turn into famines has gone dark after the Trump administration slashed foreign aid.

The Famine Early Warning Systems Network (FEWS NET) monitors drought, crop production, food prices and other indicators in order to forecast food insecurity in more than 30 countries…Now, its work to prevent hunger in Sudan, South Sudan, Somalia, Yemen, Ethiopia, Afghanistan and many other nations has been stopped amid the Trump administration’s effort to dismantle the US Agency for International Development (USAID).

“These are the most acutely food insecure countries around the globe,” said Tanya Boudreau, the former manager of the project.

Amid the aid freeze, FEWS NET has no funding to pay staff in Washington or those working on the ground. The website is down. And its treasure trove of data that underpinned global analysis on food security – used by researchers around the world – has been pulled offline.

FEWS NET is considered the gold standard in the sector, and it publishes more frequent updates than other global monitoring efforts. Those frequent reports and projections are key, experts say, because food crises evolve over time, meaning early interventions save lives and money…The team at the University of Colorado Boulder has built a model to forecast water demand in Kenya, which feeds some data into the FEWS NET project but also relies on FEWS NET data provided by other research teams.

The data is layered and complex. And scientists say pulling the data hosted by the US disrupts other research and famine-prevention work conducted by universities and governments across the globe.

“It compromises our models, and our ability to be able to provide accurate forecasts of ground water use,” Denis Muthike, a Kenyan scientist and assistant research professor at UC Boulder, told CNN, adding: “You cannot talk about food security without water security as well.”

“Imagine that that data is available to regions like Africa and has been utilized for years and years – decades – to help inform decisions that mitigate catastrophic impacts from weather and climate events, and you’re taking that away from the region,” Muthike said. He cautioned that it would take many years to build another monitoring service that could reach the same level…(More)”.

Standards


Book by Jeffrey Pomerantz and Jason Griffey: “Standards are the DNA of the built environment, encoded in nearly all objects that surround us in the modern world. In Standards, Jeffrey Pomerantz and Jason Griffey provide an essential introduction to this invisible but critical form of infrastructure—the rules and specifications that govern so many elements of the physical and digital environments, from the color of school buses to the shape of shipping containers.

In an approachable, often outright funny fashion, Pomerantz and Griffey explore the nature, function, and effect of standards in everyday life. Using examples of specific standards and contexts in which they are applied—in the realms of technology, economics, sociology, and information science—they illustrate how standards influence the development, scope, and indeed the very range of possibilities of our built and social worlds. Deeply informed and informally written, their work makes a subject generally deemed boring and complex, but fundamentally important, comprehensible, clear, and downright engaging…(More)”.

Artificial intelligence for digital citizen participation: Design principles for a collective intelligence architecture


Paper by Nicolas Bono Rossello, Anthony Simonofski, and Annick Castiaux: “The challenges posed by digital citizen participation and the amount of data generated by Digital Participation Platforms (DPPs) create an ideal context for the implementation of Artificial Intelligence (AI) solutions. However, current AI solutions in DPPs focus mainly on technical challenges, often neglecting their social impact and not fully exploiting AI’s potential to empower citizens. The goal of this paper is thus to investigate how to design digital participation platforms that integrate technical AI solutions while considering the social context in which they are implemented. Using Collective Intelligence as kernel theory, and through a literature review and a focus group, we generate design principles for the development of a socio-technically aware AI architecture. These principles are then validated by experts from the field of AI and citizen participation. The principles suggest optimizing the alignment of AI solutions with project goals, ensuring their structured integration across multiple levels, enhancing transparency, monitoring AI-driven impacts, dynamically allocating AI actions, empowering users, and balancing cognitive disparities. These principles provide a theoretical basis for future AI-driven artifacts, and theories in digital citizen participation…(More)”.

Data, waves and wind to be counted in the economy


Article by Robert Cuffe: “Wind and waves are set to be included in calculations of the size of countries’ economies for the first time, as part of changes approved at the United Nations.

Assets like oilfields were already factored in under the rules – last updated in 2008.

This update aims to capture areas that have grown since then, such as the cost of using up natural resources and the value of data.

The changes come into force in 2030 and could mean an increase in estimates of the size of the UK economy, making promises to spend a fixed share of the economy on defence or aid more expensive.

The economic value of wind and waves can be estimated from the price of all the energy that can be generated from the turbines in a country.
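The valuation method the article describes can be sketched in a few lines: value the renewable “asset” from the energy its turbines could generate, priced at market rates and discounted over the asset’s lifetime. All figures and the discounting approach below are illustrative assumptions, not the UN rule book’s actual methodology.

```python
def annual_energy_mwh(capacity_mw: float, capacity_factor: float) -> float:
    """Energy generated in one year, given installed capacity and capacity factor."""
    return capacity_mw * capacity_factor * 8760  # 8760 hours per year


def asset_value(capacity_mw: float, capacity_factor: float, price_per_mwh: float,
                lifetime_years: int = 25, discount_rate: float = 0.035) -> float:
    """Present value of the energy stream over the asset's assumed lifetime."""
    yearly_revenue = annual_energy_mwh(capacity_mw, capacity_factor) * price_per_mwh
    return sum(yearly_revenue / (1 + discount_rate) ** t
               for t in range(1, lifetime_years + 1))


# A notional 1,000 MW offshore wind estate at a 40% capacity factor and £50/MWh:
value = asset_value(1000, 0.40, 50)
print(f"Illustrative asset value: £{value / 1e9:.2f}bn")
```

The key point is that the asset's statistical value depends on energy prices and assumed lifetimes, which is why the change can move estimates of GDP without anyone being materially better off.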

The update also treats data as an asset in its own right, on top of the assets that house it, such as servers and cables.

Governments use a common rule book for measuring the size of their economies and how they grow over time.

These changes to the rule book are “tweaks, rather than a rewrite”, according to Prof Diane Coyle of the University of Cambridge.

Ben Zaranko of the Institute for Fiscal Studies (IFS) calls it an “accounting” change, rather than a real change. He explains: “We’d be no better off in a material sense, and tax revenues would be no higher.”

But it could make economies look bigger, creating a possible future spending headache for the UK government…(More)”.

Bridging the Data Provenance Gap Across Text, Speech and Video


Paper by Shayne Longpre et al: “Progress in AI is driven largely by the scale and quality of training data. Despite this, there is a deficit of empirical analysis examining the attributes of well-established datasets beyond text. In this work we conduct the largest and first-of-its-kind longitudinal audit across modalities (popular text, speech, and video datasets), from their detailed sourcing trends and use restrictions to their geographical and linguistic representation. Our manual analysis covers nearly 4000 public datasets from 1990 to 2024, spanning 608 languages, 798 sources, 659 organizations, and 67 countries. We find that multimodal machine learning applications have overwhelmingly turned to web-crawled, synthetic, and social media platforms, such as YouTube, for their training sets, eclipsing all other sources since 2019. Second, tracing the chain of dataset derivations, we find that while less than 33% of datasets are restrictively licensed, over 80% of the source content in widely used text, speech, and video datasets carries non-commercial restrictions. Finally, counter to the rising number of languages and geographies represented in public AI training datasets, our audit demonstrates that measures of relative geographical and multilingual representation have failed to significantly improve their coverage since 2013. We believe the breadth of our audit enables us to empirically examine trends in data sourcing, restrictions, and Western-centricity at an ecosystem level, and that visibility into these questions is essential to progress in responsible AI. As a contribution to ongoing improvements in dataset transparency and responsible use, we release our entire multimodal audit, allowing practitioners to trace data provenance across text, speech, and video…(More)”.
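The abstract’s two headline figures measure different things: the share of datasets whose own licenses are restrictive, versus the share of underlying source content carrying non-commercial terms. A toy sketch shows how the two can diverge sharply; the record layout and numbers below are invented for illustration and are not the paper’s actual schema or data.

```python
# Each record: (dataset name, dataset license is restrictive,
#               source items with non-commercial terms, total source items)
datasets = [
    ("web_corpus_a", False, 900, 1000),
    ("speech_set_b", True,  200,  200),
    ("video_set_c",  False, 450,  500),
]

# Dataset-level statistic: fraction of datasets restrictively licensed.
restrictive_share = sum(d[1] for d in datasets) / len(datasets)

# Content-level statistic: fraction of all source items under
# non-commercial terms, aggregated across every dataset.
content_nc_share = (sum(d[2] for d in datasets)
                    / sum(d[3] for d in datasets))

print(f"Restrictively licensed datasets: {restrictive_share:.0%}")
print(f"Source content with non-commercial terms: {content_nc_share:.0%}")
```

Here only one dataset in three is restrictively licensed, yet over 90% of the source items carry non-commercial terms, mirroring the kind of gap the audit reports between dataset licenses and upstream content restrictions.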

Reconciling open science with technological sovereignty


Paper by C. Huang & L. Soete: “Historically, open science has been effective in facilitating knowledge sharing and in promoting and diffusing innovations. However, as a result of geopolitical tensions, technological sovereignty has recently been increasingly emphasized in various countries’ science and technology policy making, posing a challenge to open science policy. In this paper, we argue that the European Union significantly benefits from and contributes to open science and should continue to support it. Similarly, China embraced foreign technologies and engaged in open science as its economy developed rapidly over the last 40 years. Today both economies could learn from each other in finding the right balance between open science and technological sovereignty, particularly given their very different policy experience and the urgency of implementing new technologies addressing grand challenges, such as climate change, faced by humankind…(More)”.

Nurturing innovation through intelligent failure: The art of failing on purpose


Paper by Alessandro Narduzzo and Valentina Forrer: “Failure, even in the context of innovation, is primarily conceived and experienced as an inevitable (e.g., innovation funnel) or unintended (e.g., unexpected drawbacks) outcome. This paper aims to provide a more systematic understanding of innovation failure by considering and problematizing the case of “intelligent failures”, namely experiments that are intentionally designed and implemented to explore technological and market uncertainty. We conceptualize intelligent failure through an epistemic perspective that recognizes its contribution to challenging and revising the organizational knowledge system. We also outline an original process model of intelligent failure that fully reveals its potential and distinctiveness in the context of learning from failure (i.e., failure as an outcome vs failure of expectations and initial beliefs), analyzing and comparing intended and unintended innovation failures. By positioning intelligent failure in the context of innovation and explaining its critical role in enhancing the ability of innovative firms to achieve breakthroughs, we identify important landmarks for practitioners in designing an intelligent failure approach to innovation…(More)”.