How governments can move beyond bureaucracy


Interview with Jorrit de Jong: “…Bureaucracy is not so much a system of rules, it is a system of values. It is an organizational form that governs how work gets done in accordance with principles that the sociologist Max Weber first codified: standardization, formalization, expert officialdom, specialization, hierarchy, and accountability. Add those up and you arrive at a system that values the written word; that is siloed because that’s what specialization does; that can sometimes be slow because there is a chain of command and an approval process. Standardization supports the value that it doesn’t matter who you are, who you know, what you look like when you’re applying for a permit, or who is issuing the permit: the case will be evaluated on its merits. That is a good thing. Bureaucracy is a way to do business in a rational, impersonal, responsible, and efficient way, at least in theory.

It becomes a problem when organizations start to violate their own values and lose connection with their purpose. If standardization turns into rigidity, doing justice to extenuating individual circumstances becomes hard. If formalization becomes pointless paper pushing, it defeats the purpose. And if accountability structures favor risk aversion over taking initiative, organizations can’t innovate.

Bureaucratic dysfunction occurs when the system that we’ve created ceases to produce the value that we wanted out of it. But that does not mean we have to throw the baby out with the bathwater. Can we create organizations that have the benefits of accountability, standardization, and specialization without the burdens of slowness, rigidity, and silos? My answer is yes. Research we did with the Bloomberg Harvard City Leadership Initiative shows how organizations can improve performance by building capabilities that make them more nimble, responsive, and user-friendly. Cities that leverage data to better understand the communities they serve and measure performance learn and improve faster. Cities that use design thinking to reinvent resident services save time and money. And cities that collaborate across organizational and sector boundaries come up with more effective solutions to urban problems…(More)”

What Autocrats Want From Academics: Servility


Essay by Anna Dumont: “Since Trump’s inauguration, the university community has received a good deal of “messaging” from academic leadership. We’ve received emails from our deans and university presidents; we’ve sat in department meetings regarding the “developing situation”; and we’ve seen the occasional official statement or op-ed or comment in the local newspaper. And the unfortunate takeaway from all this is that our leaders’ strategy rests on a disturbing and arbitrary distinction. The public-facing language of the university — mission statements, programming, administrative structures, and so on — has nothing at all to do with the autonomy of our teaching and research, which, they assure us, they hold sacrosanct. Recent concessions — say, the disappearance of the website of the Women’s Center — are concerning, they admit, but ultimately inconsequential to our overall working lives as students and scholars.

History, however, shows that public-facing statements are deeply consequential, and one episode from the 20-year march of Italian fascism strikes me as especially instructive. On October 8, 1931, a law went into effect requiring, as a condition of their employment, every Italian university professor to sign an oath pledging their loyalty to the government of Benito Mussolini. Out of over 1,200 professors in the country, only 12 refused.

Today, those who refused are known simply as “I Dodici”: the Twelve. They were a scholar of Middle Eastern languages, an organic chemist, a doctor of forensic medicine, three lawyers, a mathematician, a theologian, a surgeon, a historian of ancient Rome, a philosopher of Kantian ethics, and one art historian. Two, Francesco Ruffini and Edoardo Ruffini Avondo, were father and son. Four were Jewish. All of them were immediately fired…(More)”

Global population data is in crisis – here’s why that matters


Article by Andrew J Tatem and Jessica Espey: “Every day, decisions that affect our lives depend on knowing how many people live where. For example, how many vaccines are needed in a community, where polling stations should be placed for elections or who might be in danger as a hurricane approaches. The answers rely on population data.

But counting people is getting harder.

For centuries, census and household surveys have been the backbone of population knowledge. But we’ve just returned from the UN’s statistical commission meetings in New York, where experts reported that something alarming is happening to population data systems globally.

Census response rates are declining in many countries, resulting in large margins of error. The 2020 US census undercounted America’s Latino population by more than three times the rate of the 2010 census. In Paraguay, the latest census revealed a population one-fifth smaller than previously thought.

South Africa’s 2022 census post-enumeration survey revealed a likely undercount of more than 30%. According to the UN Economic Commission for Africa, undercounts and census delays due to COVID-19, conflict or financial limitations have resulted in an estimated one in three Africans not being counted in the 2020 census round.

When people vanish from data, they vanish from policy. When certain groups are systematically undercounted – often minorities, rural communities or poorer people – they become invisible to policymakers. This translates directly into political underrepresentation and inadequate resource allocation…(More)”.

How social media and online communities influence climate change beliefs


Article by James Rice: “Psychological, social, and political forces all shape beliefs about climate change. Climate scientists bear a responsibility — not only as researchers and educators, but as public communicators — to guard against climate misinformation. This responsibility should be foundational, supported by economists, sociologists, and industry leaders.

While fake news manifests in various forms, not all forms of misinformation are created with the intent to deceive. Regardless of intent, climate misinformation threatens policy integrity. Strengthening environmental communication is thus crucial to counteract ideological divides that distort scientific discourse and weaken public trust.

Political polarisation, misinformation, and the erosion of scientific authority pose challenges demanding rigorous scholarship and proactive public engagement. Climate scientists, policymakers, and climate justice advocates must ensure scientific integrity while recognising that climate science operates in a politically charged landscape. Agnosticism and resignation in the face of climate misinformation are as dangerous as outright denial of climate science. Combating it extends beyond scientific accuracy. It requires strategic communication, engagement with advocacy groups, and the reinforcement of public trust in environmental expertise…(More)”.

Trump Admin Plans to Cut Team Responsible for Critical Atomic Measurement Data


Article by Louise Matsakis and Will Knight: “The US National Institute of Standards and Technology (NIST) is discussing plans to eliminate an entire team responsible for publishing and maintaining critical atomic measurement data in the coming weeks, as the Trump administration continues its efforts to reduce the US federal workforce, according to a March 18 email sent to dozens of outside scientists. The data in question underpins advanced scientific research around the world in areas like semiconductor manufacturing and nuclear fusion…(More)”.

Bubble Trouble


Article by Bryan McMahon: “…Venture capital (VC) funds, drunk on a decade of “growth at all costs,” have poured about $200 billion into generative AI. Making matters worse, the stock market’s bull run is deeply dependent on the growth of the Big Tech companies fueling the AI bubble. In 2023, 71 percent of the total gains in the S&P 500 were attributable to the “Magnificent Seven”—Apple, Nvidia, Tesla, Alphabet, Meta, Amazon, and Microsoft—all of which are among the biggest spenders on AI. Just four—Microsoft, Alphabet, Amazon, and Meta—combined for $246 billion of capital expenditure in 2024 to support the AI build-out. Goldman Sachs expects Big Tech to spend over $1 trillion on chips and data centers to power AI over the next five years. Yet OpenAI, the current market leader, expects to lose $5 billion this year and expects its annual losses to swell to $11 billion by 2026. If the AI bubble bursts, it threatens not only to wipe out VC firms in the Valley but also to blow a gaping hole in the public markets and cause an economy-wide meltdown…(More)”.

Integrating Social Media into Biodiversity Databases: The Next Big Step?


Article by Muhammad Osama: “Digital technologies and social media have transformed ecology and conservation biology data collection. Traditional biodiversity monitoring often relies on field surveys, which can be time-consuming and biased toward rural habitats.

The Global Biodiversity Information Facility (GBIF) serves as a key repository for biodiversity data, but it faces challenges such as delayed data availability and underrepresentation of urban habitats.

Social media platforms have become valuable tools for rapid data collection, enabling users to share georeferenced observations instantly, reducing time lags associated with traditional methods. The widespread use of smartphones with cameras allows individuals to document wildlife sightings in real-time, enhancing biodiversity monitoring. Integrating social media data with traditional ecological datasets offers significant advancements, particularly in tracking species distributions in urban areas.

In this paper, the authors evaluated the Jersey tiger moth’s (JTM) habitat usage by comparing occurrence data from social media platforms (Instagram and Flickr) with traditional records from GBIF and iNaturalist. They hypothesized that social media data would reveal significant JTM occurrences in urban environments, which may be underrepresented in traditional datasets…(More)”.

Panels giving scientific advice to Census Bureau disbanded by Trump administration


Article by Jeffrey Mervis: “…U.S. Secretary of Commerce Howard Lutnick has disbanded five outside panels that provide scientific and community advice to the U.S. Census Bureau and other federal statistical agencies just as preparations are ramping up for the country’s next decennial census, in 2030.

The dozens of demographers, statisticians, and public members on the five panels received nearly identical letters this week telling them that “the Secretary of Commerce has determined that the purposes for which the [committee] was established have been fulfilled, and the committee has been terminated effective February 28, 2025. Thank you for your service.”

Statistician Robert Santos, who last month resigned as Census Bureau director 3 years into his 5-year term, says he’s “terribly disappointed but not surprised” by the move, noting how a recent directive by President Donald Trump on gender identity has disrupted data collection for a host of federal surveys…(More)”.

Government data is disappearing before our eyes


Article by Anna Massoglia: “A battle is being waged in the quiet corners of government websites and data repositories. Essential public records are disappearing and, with them, Americans’ ability to hold those in power accountable.

Take the Department of Government Efficiency, Elon Musk’s federal cost-cutting initiative. Touted as “maximally transparent,” DOGE is supposed to make government spending more efficient. But when journalists and researchers exposed major errors — from double-counting contracts to conflating caps with actual spending — DOGE didn’t fix the mistakes. Instead, it made them harder to detect.

Many Americans hoped DOGE’s work would be a step toward cutting costs and restoring trust in government. But trust must be earned. If our leaders truly want to restore faith in our institutions, they must ensure that facts remain available to everyone, not just when convenient.

Since Jan. 20, public records across the federal government have been erased. Economic indicators that guide investments, scientific datasets that drive medical breakthroughs, federal health guidelines and historical archives that inform policy decisions have all been put on the chopping block. Some missing datasets have been restored but are incomplete or have unexplained changes, rendering them unreliable.

Both Republican and Democratic administrations have played a role in limiting public access to government records. But the scale and speed of the Trump administration’s data manipulation — combined with buyouts, resignations and other restructuring across federal agencies — signal a new phase in the war on public information. This is not just about deleting files, it’s about controlling what the public sees, shaping the narrative and limiting accountability.

The Trump administration is accelerating this trend with revisions to official records. Unelected advisors are overseeing a sweeping reorganization of federal data, granting entities like DOGE unprecedented access to taxpayer records with little oversight. This is not just a bureaucratic reshuffle — it is a fundamental reshaping of the public record.

The consequences of data manipulation extend far beyond politics. When those in power control the flow of information, they can dictate collective truth. Governments that manipulate information are not just rewriting statistics — they are rewriting history.

From authoritarian regimes that have erased dissent to leaders who have fabricated economic numbers to maintain their grip on power, the dangers of suppressing and distorting data are well-documented.

Misleading or inconsistent data can be just as dangerous as opacity. When hard facts are replaced with political spin, conspiracy theories take root and misinformation fills the void.

The fact that data suppression and manipulation have occurred before does not lessen the danger, but underscores the urgency of taking proactive measures to safeguard transparency. A missing statistic today can become a missing historical fact tomorrow. Over time, that can reshape our reality…(More)”.

Can small language models revitalize Indigenous languages?


Article by Brooke Tanner and Cameron F. Kerry: “Indigenous languages play a critical role in preserving cultural identity and transmitting unique worldviews, traditions, and knowledge, but at least 40% of the world’s 6,700 languages are currently endangered. The United Nations declared 2022-2032 as the International Decade of Indigenous Languages to draw attention to this threat, in hopes of supporting the revitalization of these languages and preservation of access to linguistic resources.  

Building on the advantages of small language models (SLMs), several initiatives have successfully adapted these models specifically for Indigenous languages. Such Indigenous language models (ILMs) represent a subset of SLMs that are designed, trained, and fine-tuned with input from the communities they serve.

Case studies and applications 

  • Meta released No Language Left Behind (NLLB-200), a 54 billion–parameter open-source machine translation model that supports 200 languages as part of Meta’s universal speech translator project. The model includes support for languages with limited translation resources. While its breadth of language coverage is novel, NLLB-200 can struggle to capture the intricacies of local context for low-resource languages and often relies on machine-translated sentence pairs gathered from across the internet due to the scarcity of digitized monolingual data.
  • Lelapa AI’s InkubaLM-0.4B is an SLM with applications for low-resource African languages. Trained on 1.9 billion tokens across languages including isiZulu, Yoruba, Swahili, and isiXhosa, InkubaLM-0.4B (with 400 million parameters) builds on Meta’s LLaMA 2 architecture, providing a smaller model than the original LLaMA 2 pretrained model with 7 billion parameters. 
  • IBM Research Brazil and the University of São Paulo have collaborated on projects aimed at preserving Brazilian Indigenous languages such as Guarani Mbya and Nheengatu. These initiatives emphasize co-creation with Indigenous communities and address concerns about cultural exposure and language ownership. Initial efforts included electronic dictionaries, word prediction, and basic translation tools. Notably, when a prototype writing assistant for Guarani Mbya raised concerns about exposing their language and culture online, project leaders paused further development pending community consensus.  
  • Researchers have fine-tuned pre-trained models for Nheengatu using linguistic educational sources and translations of the Bible, with plans to incorporate community-guided spellcheck tools. Because the Bible translations, produced primarily by colonial priests, often sounded archaic and could reflect cultural abuse and violence, they were classified as potentially “toxic” data that would not be used in any deployed system without explicit Indigenous community agreement…(More)”.