The Social Biome: How Everyday Communication Connects and Shapes Us


Book by Andy J. Merolla and Jeffrey A. Hall: “We spend much of our waking lives communicating with others. How does each moment of interaction shape not only our relationships but also our worldviews? And how can we create moments of connection that improve our health and well-being, particularly in a world in which people are feeling increasingly isolated?
 
Drawing from their extensive research, Andy J. Merolla and Jeffrey A. Hall establish a new way to think about our relational life: as existing within “social biomes”—complex ecosystems of moments of interaction with others. Each interaction we have, no matter how unimportant or mundane it might seem, is a building block of our identities and beliefs. Consequently, the choices we make about how we interact and who we interact with—and whether we interact at all—matter more than we might know. Merolla and Hall offer a sympathetic, practical guide to our vital yet complicated social lives and propose realistic ways to embrace and enhance connection and hope…(More)”.

Data Localization: A Global Threat to Human Rights Online


Article by Freedom House: “From Pakistan to Zambia, governments around the world are increasingly proposing and passing data localization legislation. These laws, which govern the storage and transfer of electronic data across jurisdictions, are often justified as addressing concerns such as user privacy, cybersecurity, national security, and monopolistic market practices. Notwithstanding these laudable goals, data localization initiatives cause more harm than good, especially in legal environments with poor rule of law.

Data localization requirements can take many different forms. A government may require all companies collecting and processing certain types of data about local users to store the data on servers located in the country. Authorities may also restrict the foreign transfer of certain types of data or allow it only under narrow circumstances, such as after obtaining the explicit consent of users, receiving a license or permit from a public authority, or conducting a privacy assessment of the country to which the data will be transferred.

While data localization can have significant economic and security implications, the focus of this piece—in line with that of the Global Network Initiative and Freedom House—is on its potential human rights impacts, which are varied. Freedom House’s research shows that the rise in data localization policies worldwide is contributing to the global decline of internet freedom. Without robust transparency and accountability frameworks embedded into these provisions, digital rights are often put at risk. As these types of legislation continue to pop up globally, the need for rights-respecting solutions and norms for cross-border data flows is greater than ever…(More)”.

Why more AI researchers should collaborate with governments


Article by Mohamed Ibrahim: “Artificial intelligence (AI) is beginning to transform many industries, yet its use to improve public services remains limited globally. AI-based tools could streamline access to government benefits through online chatbots or automate systems by which citizens report problems such as potholes.

Currently, scholarly advances in AI are mostly confined to academic papers and conferences, rarely translating into actionable government policies or products. This means that the expertise at universities is not used to solve real-world problems. As a No10 Innovation Fellow with the UK government and a lecturer in spatial data science, I have explored the potential of AI-driven rapid prototyping in public policy.

Take Street.AI, a prototype smartphone app that I developed, which lets citizens report issues including potholes, street violence or illegal litter dumping by simply taking a picture through the app. The AI model classifies the problem automatically and alerts the relevant local authority, passing on the location and details of the issue. A key feature of the app is its on-device processing, which ensures privacy and reduces operational costs. Similar tools were tested as an early-warning system during the riots that swept the United Kingdom in July and August 2024.

AI models can also aid complex decision-making — for instance, that involved in determining where to build houses. The UK government plans to construct 1.5 million homes in the next 5 years, but planning laws require that several parameters be considered — such as proximity to schools, noise levels, the neighbourhoods’ built-up ratio and flood risk. The current strategy is to compile voluminous academic reports on viable locations, but an online dashboard powered by AI that can optimize across parameters would be much more useful to policymakers…(More)”.
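The classify-and-route step described in the Street.AI excerpt above can be illustrated with a minimal sketch. Note the assumptions: the article does not disclose the app’s actual model, categories, or authority mappings, so `classify_image`, the category names, and `AUTHORITY_FOR` below are all hypothetical placeholders. In a real deployment, `classify_image` would run a small quantized vision model on the phone itself, so the photo never leaves the device.

```python
from dataclasses import dataclass

# Hypothetical issue categories the on-device model might output.
CATEGORIES = {"pothole", "litter", "street_violence"}

# Illustrative mapping from issue category to the responsible local authority.
AUTHORITY_FOR = {
    "pothole": "Highways Department",
    "litter": "Waste Services",
    "street_violence": "Local Police",
}

@dataclass
class Report:
    category: str
    latitude: float
    longitude: float
    authority: str

def classify_image(image_bytes: bytes) -> str:
    """Placeholder for the on-device classifier.

    A real app would run a compact vision model locally (the privacy
    and cost benefit the article describes). Stubbed here so the
    routing logic is runnable.
    """
    return "pothole"

def file_report(image_bytes: bytes, latitude: float, longitude: float) -> Report:
    """Classify a citizen's photo and route it to the relevant authority."""
    category = classify_image(image_bytes)
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    return Report(category, latitude, longitude, AUTHORITY_FOR[category])
```

The design choice worth noting is that only the `Report` (category plus location), not the image, needs to be transmitted, which is what keeps processing on-device and operational costs low.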

Massive, Unarchivable Datasets of Cancer, Covid, and Alzheimer’s Research Could Be Lost Forever


Article by Sam Cole: “Almost two dozen repositories of research and public health data supported by the National Institutes of Health are marked for “review” under the Trump administration’s direction, and researchers and archivists say the data is at risk of being lost forever if the repositories go down. 

“The problem with archiving this data is that we can’t,” Lisa Chinn, Head of Research Data Services at the University of Chicago, told 404 Media. Unlike other government datasets or web pages, downloading or otherwise archiving NIH data often requires a Data Use Agreement between a research institution and the agency, and those agreements are carefully administered through a disclosure risk review process. 

A message appeared at the top of multiple NIH websites last week that says: “This repository is under review for potential modification in compliance with Administration directives.”

Repositories with the message include archives of cancer imagery, Alzheimer’s disease research, sleep studies, HIV databases, and COVID-19 vaccination and mortality data…

“So far, it seems like what is happening is less that these data sets are actively being deleted or clawed back and more that they are laying off the workers whose job is to maintain them, update them and maintain the infrastructure that supports them,” a librarian affiliated with the Data Rescue Project told 404 Media. “In time, this will have the same effect, but it’s really hard to predict. People don’t usually appreciate, much less our current administration, how much labor goes into maintaining a large research dataset.”…(More)”.

Situating Digital Self-Determination (DSD): A Comparison with Existing and Emerging Digital and Data Governance Approaches


Paper by Sara Marcucci and Stefaan Verhulst: “In today’s increasingly complex digital landscape, traditional data governance models, such as consent-based, ownership-based, and sovereignty-based approaches, are proving insufficient to address the evolving ethical, social, and political dimensions of data use. These frameworks, often grounded in static and individualistic notions of control, struggle to keep pace with the fluidity and relational nature of contemporary data ecosystems. This paper proposes Digital Self-Determination (DSD) as a complementary and necessary evolution of existing models, offering a more participatory, adaptive, and ethically grounded approach to data governance. Centering ongoing agency, collective participation, and contextual responsiveness, DSD builds on foundational principles of consent and control while addressing their limitations. Drawing on comparisons with a range of governance models, including risk-based, compliance-oriented, principles-driven, and justice-centered frameworks, this paper highlights DSD’s unique contribution: its capacity to enable individuals and communities to actively shape how data about them is used, shared, and governed over time. In doing so, it reimagines data governance as a living, co-constructed practice grounded in trust, accountability, and care. Through this lens, the paper offers a framework for comparing different governance approaches and embedding DSD into existing paradigms, inviting policymakers and practitioners to consider how more inclusive and responsive forms of digital governance might be realized…(More)”.

Digital Technologies and Participatory Governance in Local Settings: Comparing Digital Civic Engagement Initiatives During the COVID-19 Outbreak


Chapter by Nathalie Colasanti, Chiara Fantauzzi, Rocco Frondizi & Noemi Rossi: “Governance paradigms have undergone a deep transformation during the COVID-19 pandemic, necessitating agile, inclusive, and responsive mechanisms to address evolving challenges. Participatory governance has emerged as a guiding principle, emphasizing inclusive decision-making processes and collaboration among diverse stakeholders. In the outbreak context, digital technologies have played a crucial role in enabling participatory governance to flourish, democratizing participation, and facilitating the rapid dissemination of accurate information. These technologies have also empowered grassroots initiatives, such as civic hacking, to address societal challenges and mobilize communities for collective action. This study delves into the realm of bottom-up participatory initiatives at the local level, focusing on two emblematic cases of civic hacking experiences launched during the pandemic, the first in Wuhan, China, and the second in Italy. Through a comparative lens, drawing upon secondary sources, the aim is to analyze the dynamics, efficacy, and implications of these initiatives, shedding light on the evolving landscape of participatory governance in times of crisis. Findings underline the transformative potential of civic hacking and participatory governance in crisis response, highlighting the importance of collaboration, transparency, and inclusivity…(More)”.

DOGE comes for the data wonks


The Economist: “For nearly three decades the federal government has painstakingly surveyed tens of thousands of Americans each year about their health. Door-knockers collect data on the financial toll of chronic conditions like obesity and asthma, and probe the exact doses of medications sufferers take. The result, known as the Medical Expenditure Panel Survey (MEPS), is the single most comprehensive, nationally representative portrait of American health care, a balkanised and unwieldy $5trn industry that accounts for some 17% of GDP.

MEPS is part of a largely hidden infrastructure of government statistics collection now in the crosshairs of the Department of Government Efficiency (DOGE). In mid-March officials at a unit of the Department of Health and Human Services (HHS) that runs the survey told employees that DOGE had slated them for an 80-90% reduction in staff and that this would “not be a negotiation”. Since then scores of researchers have taken voluntary buyouts. Those left behind worry about the integrity of MEPS. “Very unclear whether or how we can put on MEPS” with roughly half of the staff leaving, one said. On March 27th, the health secretary, Robert F. Kennedy junior, announced an overall reduction of 10,000 personnel at the department, in addition to those who took buyouts.

There are scores of underpublicised government surveys like MEPS that document trends in everything from house prices to the amount of lead in people’s blood. Many provide standard-setting datasets and insights into the world’s largest economy that the private sector has no incentive to replicate.

Even so, America’s system of statistics research is overly analogue and needs modernising. “Using surveys as the main source of information is just not working” because it is too slow and suffers from declining rates of participation, says Julia Lane, an economist at New York University. In a world where the economy shifts by the day, the lags in traditional surveys—whose results can take weeks or even years to refine and publish—are unsatisfactory. One practical reform DOGE might encourage is better integration of administrative data such as tax records and social-security filings which often capture the entire population and are collected as a matter of course.

As in so many other areas, however, DOGE’s sledgehammer is more likely to cause harm than to achieve improvements. And for all its clunkiness, America’s current system manages a spectacular feat. From Inuits in remote corners of Alaska to Spanish-speakers in the Bronx, it measures the country and its inhabitants remarkably well, given that the population is highly diverse and spread out over 4m square miles. Each month surveys from the federal government reach about 1.5m people, a number roughly equivalent to the population of Hawaii or West Virginia…(More)”.

How governments can move beyond bureaucracy


Interview with Jorrit de Jong: “…Bureaucracy is not so much a system of rules, it is a system of values. It is an organizational form that governs how work gets done in accordance with principles that the sociologist Max Weber first codified: standardization, formalization, expert officialdom, specialization, hierarchy, and accountability. Add those up and you arrive at a system that values the written word; that is siloed because that’s what specialization does; that can sometimes be slow because there is a chain of command and an approval process. Standardization supports the value that it doesn’t matter who you are, who you know, what you look like when you’re applying for a permit, or who is issuing the permit: the case will be evaluated based on its merits. That is a good thing. Bureaucracy is a way to do business in a rational, impersonal, responsible and efficient way, at least in theory.

It becomes a problem when organizations start to violate their own values and lose connection with their purpose. If standardization turns into rigidity, doing justice to extenuating individual circumstances becomes hard. If formalization becomes pointless paper pushing, it defeats the purpose. And if accountability structures favor risk aversion over taking initiative, organizations can’t innovate.

Bureaucratic dysfunction occurs when the system that we’ve created ceases to produce the value that we wanted out of it. But that does not mean we have to throw the baby out with the bathwater. Can we create organizations that have the benefits of accountability, standardization and specialization without the burdens of slowness, rigidity, and silos? My answer is yes. Research we did with the Bloomberg Harvard City Leadership Initiative shows how organizations can improve performance by building capabilities that make them more nimble, responsive, and user-friendly. Cities that leverage data to better understand the communities they serve and measure performance learn and improve faster. Cities that use design thinking to reinvent resident services save time and money. And cities that collaborate across organizational and sector boundaries come up with more effective solutions to urban problems…(More)”

Researching data discomfort: The case of Statistics Norway’s quest for billing data


Paper by Lisa Reutter: “National statistics offices are increasingly exploring the possibilities of utilizing new data sources to position themselves in emerging data markets. In 2022, Statistics Norway announced that the national agency will require the biggest grocers in Norway to hand over all collected billing data to produce consumer behavior statistics, which had previously been produced by other sampling methods. An online article discussing this proposal sparked a surprisingly (at least to Statistics Norway) high level of interest among readers, many of whom expressed concerns about this intended change in data practice. This paper focuses on the multifaceted online discussions of the proposal, as these enable us to study citizens’ reactions and feelings towards increased data collection and emerging public-private data flows in a Nordic context. Through an explorative empirical analysis of comment sections, this paper investigates what is discussed by commenters and reflects upon why this case sparked so much interest among citizens in the first place. It therefore contributes to the growing literature on citizens’ voices in data-driven administration and to a wider discussion on how to research public feeling towards datafication. I argue that this presents an interesting case of discomfort voiced by citizens, which demonstrates the contested nature of data practices among citizens–and their ability to regard data as deeply intertwined with power and politics. This case also reminds researchers to pay attention to seemingly benign and small changes in administration beyond artificial intelligence…(More)”

Which Data Do Economists Use to Study Corruption?


World Bank paper: “…examines the data sources and methodologies used in economic research on corruption by analyzing 339 journal articles published in 2022 that include Journal of Economic Literature codes. The paper identifies the most commonly used data types, sources, and geographical foci, as well as whether studies primarily investigate the causes or consequences of corruption. Cross-country composite indicators remain the dominant measure, while single country studies more frequently utilize administrative data. Articles in ranked journals are more likely to employ administrative and experimental data and focus on the causes of corruption. The broader dataset of 882 articles highlights the significant academic interest in corruption across disciplines, particularly in political science and public policy. The findings raise concerns about the limited use of novel data sources and the relative neglect of research on the causes of corruption, underscoring the need for a more integrated approach within the field of economics…(More)”.