Beware the Intention Economy: Collection and Commodification of Intent via Large Language Models
Article by Yaqub Chaudhary and Jonnie Penn: “The rapid proliferation of large language models (LLMs) invites the possibility of a new marketplace for behavioral and psychological data that signals intent. This brief article introduces some initial features of that emerging marketplace. We survey recent efforts by tech executives to position the capture, manipulation, and commodification of human intentionality as a lucrative parallel to—and viable extension of—the now-dominant attention economy, which has bent consumer, civic, and media norms around users’ finite attention spans since the 1990s. We call this follow-on the intention economy. We characterize it in two ways. First, as a competition, initially, between established tech players armed with the infrastructural and data capacities needed to vie for first-mover advantage on a new frontier of persuasive technologies. Second, as a commodification of hitherto unreachable levels of explicit and implicit data that signal intent, namely those signals borne of combining (a) hyper-personalized manipulation via LLM-based sycophancy, ingratiation, and emotional infiltration and (b) increasingly detailed categorization of online activity elicited through natural language.
This new dimension of automated persuasion draws on the unique capabilities of LLMs and generative AI more broadly, which intervene not only on what users want, but also, to cite Williams, “what they want to want” (Williams, 2018, p. 122). We demonstrate through a close reading of recent technical and critical literature (including unpublished papers from arXiv) that such tools are already being explored to elicit, infer, collect, record, understand, forecast, and ultimately manipulate, modulate, and commodify human plans and purposes, both mundane (e.g., selecting a hotel) and profound (e.g., selecting a political candidate)…(More)”.
State of Digital Local Government
Report by the Local Government Association (UK): “This report is themed around four interrelated areas of the state of digital in local government: market concentration, service delivery, technology, and delivery capabilities. It is particularly challenging to assess the current state of digital transformation in local government, given the diversity of experience and resources, and the lack of consistent data collection on digital transformation and technology estates.
This report is informed through our regular and extensive engagement with local government, primary research carried out by the LGA, and the research of stakeholders. It is worth noting that research on market concentration is challenging as it is a highly sensitive area.
Key messages:
- Local government is a vital part of the public sector innovation ecosystem. Local government needs its priorities and context to be understood within cross-public-sector digital transformation ambitions, through representation on public sector strategic boards, and subsequently integrated into the design of public sector guidance and cross-government products at the earliest point. This will reduce the likelihood of duplication at public expense. Local government must also have equivalent access to training as civil servants…(More)”.
Good government data requires good statistics officials – but how motivated and competent are they?
World Bank Blog: “Government data is only as reliable as the statistics officials who produce it. Yet, surprisingly little is known about these officials themselves. For decades, they have diligently collected data on others – such as households and firms – to generate official statistics, from poverty rates to inflation figures. Yet, data about statistics officials themselves is missing. How competent are they at analyzing statistical data? How motivated are they to excel in their roles? Do they uphold integrity when producing official statistics, even in the face of opposing career incentives or political pressures? And what can National Statistical Offices (NSOs) do to cultivate a workforce that is competent, motivated, and ethical?
We surveyed 13,300 statistics officials in 14 countries in Latin America and the Caribbean to find out. Five results stand out. For further insights, consult our Inter-American Development Bank (IDB) report, Making National Statistical Offices Work Better.
1. The competence and management of statistics officials shape the quality of statistical data
Our survey included a short exam assessing basic statistical competencies, such as descriptive statistics and probability. Statistical competence correlates with data quality: NSOs with higher exam scores among employees tend to achieve better results in the World Bank’s Statistical Performance Indicators (r = 0.36).
NSOs with better management practices also have better statistical performance. For instance, NSOs with more robust recruitment and selection processes have better statistical performance (r = 0.62)…(More)”.
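The correlations reported above (r = 0.36 between exam scores and statistical performance; r = 0.62 for recruitment practices) are standard Pearson coefficients. As a minimal sketch of how such a figure is computed, the snippet below implements Pearson's r from its definition and applies it to hypothetical per-NSO values (the numbers are invented for illustration and are not from the IDB report):

```python
# Hypothetical data for illustration only: mean staff exam score and
# World Bank Statistical Performance Indicator (SPI) score per NSO.
exam_scores = [52.0, 61.5, 58.0, 70.2, 66.8, 74.1]
spi_scores = [60.1, 63.4, 58.9, 72.5, 65.0, 78.3]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance numerator and variance terms of the Pearson formula.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(round(pearson_r(exam_scores, spi_scores), 2))
```

A value near +1 indicates that NSOs whose staff score higher on the exam also tend to score higher on the SPI; a value near 0 would indicate no linear relationship. In practice one would use `scipy.stats.pearsonr`, which also returns a p-value.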
How and When to Involve Crowds in Scientific Research
Book by Marion K. Poetz and Henry Sauermann: “This book explores how millions of people can significantly contribute to scientific research with their effort and experience, even if they are not working at scientific institutions and may not have formal scientific training.
Drawing on a strong foundation of scholarship on crowd involvement, this book helps researchers recognize and understand the benefits and challenges of crowd involvement across key stages of the scientific process. Designed as a practical toolkit, it enables scientists to critically assess the potential of crowd participation, determine when it can be most effective, and implement it to achieve meaningful scientific and societal outcomes.
The book also discusses how recent developments in artificial intelligence (AI) shape the role of crowds in scientific research and can enhance the effectiveness of crowd science projects…(More)”
Nearly all Americans use AI, though most dislike it, poll shows
Axios: “The vast majority of Americans use products that involve AI, but their views of the technology remain overwhelmingly negative, according to a Gallup-Telescope survey published Wednesday.
Why it matters: The rapid advancement of generative AI threatens to have far-reaching consequences for Americans’ everyday lives, including reshaping the job market, impacting elections, and affecting the health care industry.
The big picture: An estimated 99% of Americans used at least one AI-enabled product in the past week, but nearly two-thirds didn’t realize they were doing so, according to the poll’s findings.
- These products included navigation apps, personal virtual assistants, weather forecasting apps, streaming services, shopping websites and social media platforms.
- Ellyn Maese, a senior research consultant at Gallup, told Axios that the disconnect is because there is “a lot of confusion when it comes to what is just a computer program versus what is truly AI and intelligent.”
Zoom in: Despite its prevalent use, Americans’ views of AI remain overwhelmingly bleak, the survey found.
- 72% of those surveyed had a “somewhat” or “very” negative opinion of how AI would impact the spread of false information, while 64% said the same about how it affects social connections.
- The only area where a majority of Americans (61%) had a positive view of AI’s impact was regarding how it might help medical diagnosis and treatment…
State of play: The survey found that 68% of Americans believe the government and businesses equally bear responsibility for addressing the spread of false information related to AI.
- 63% said the same about personal data privacy violations.
- Majorities of those surveyed felt the same about combatting the unauthorized use of individuals’ likenesses (62%) and AI’s impact on job losses (52%).
- In fact, the only area where Americans felt differently was when it came to national security threats; 62% of those surveyed said the government bore primary responsibility for reducing such threats…(More)”.
Governing artificial intelligence means governing data: (re)setting the agenda for data justice
Paper by Linnet Taylor, Siddharth Peter de Souza, Aaron Martin, and Joan López Solano: “The field of data justice has been evolving to take into account the role of data in powering the field of artificial intelligence (AI). In this paper we review the main conceptual bases for governing data and AI: the market-based approach, the personal–non-personal data distinction and strategic sovereignty. We then analyse how these are being operationalised into practical models for governance, including public data trusts, data cooperatives, personal data sovereignty, data collaboratives, data commons approaches and indigenous data sovereignty. We interrogate these models’ potential for just governance based on four benchmarks which we propose as a reformulation of the Data Justice governance agenda identified by Taylor in her 2017 framework. Re-situating data justice at the intersection of data and AI, these benchmarks focus on preserving and strengthening public infrastructures and public goods; inclusiveness; contestability and accountability; and global responsibility. We demonstrate how they can be used to test whether a governance approach will succeed in redistributing power, engaging with public concerns and creating a plural politics of AI…(More)”.
Local Government and Citizen Co-production Through Neighborhood Associations
Chapter by Kohei Suzuki: “This chapter explores co-production practices of local governments, focusing on the role of neighborhood associations (NHAs) in Japan. This chapter investigates the overall research question of this book: how to enhance government performance under resource constraints, by focusing on the concept of citizen co-production. Historically, NHAs have played a significant role in supplementing municipal service provisions as co-producers for local governments. Despite the rich history of NHAs and their contributions to public service delivery at the municipal level, theoretical and empirical studies on NHAs and co-production practices remain limited. This chapter aims to address this research gap by exploring the following research questions: What are NHAs from a perspective of citizen co-production? What are the potential contributions of studying NHAs to the broader theory of co-production? What are the future research agendas? The chapter provides an overview of the origin and evolution of the co-production concept. It then examines the main characteristics and activities of NHAs and discusses their roles in supplementing local public service provision. Finally, the chapter proposes potential research agendas to advance studies on co-production using Japan as a case study…(More)”.
Boosting: Empowering Citizens with Behavioral Science
Paper by Stefan M. Herzog and Ralph Hertwig: “…Behavioral public policy came to the fore with the introduction of nudging, which aims to steer behavior while maintaining freedom of choice. Responding to critiques of nudging (e.g., that it does not promote agency and relies on benevolent choice architects), other behavioral policy approaches focus on empowering citizens. Here we review boosting, a behavioral policy approach that aims to foster people’s agency, self-control, and ability to make informed decisions. It is grounded in evidence from behavioral science showing that human decision making is not as notoriously flawed as the nudging approach assumes. We argue that addressing the challenges of our time—such as climate change, pandemics, and the threats to liberal democracies and human autonomy posed by digital technologies and choice architectures—calls for fostering capable and engaged citizens as a first line of response to complement slower, systemic approaches…(More)”.
The Tyranny of Now
Essay by Nicholas Carr: “…Communication systems are also transportation systems. Each medium carries information from here to there, whether in the form of thoughts and opinions, commands and decrees, or artworks and entertainments.
What Innis saw is that some media are particularly good at transporting information across space, while others are particularly good at transporting it through time. Some are space-biased while others are time-biased. Each medium’s temporal or spatial emphasis stems from its material qualities. Time-biased media tend to be heavy and durable. They last a long time, but they are not easy to move around. Think of a gravestone carved out of granite or marble. Its message can remain legible for centuries, but only those who visit the cemetery are able to read it. Space-biased media tend to be lightweight and portable. They’re easy to carry, but they decay or degrade quickly. Think of a newspaper printed on cheap, thin stock. It can be distributed in the morning to a large, widely dispersed readership, but by evening it’s in the trash.
Because every society organizes and sustains itself through acts of communication, the material biases of media do more than determine how long messages last or how far they reach. They play an important role in shaping a society’s size, form, and character — and ultimately its fate. As the sociologist Andrew Wernick explained in a 1999 essay on Innis, “The portability of media influences the extent, and the durability of media the longevity, of empires, institutions, and cultures.”
In societies where time-biased media are dominant, the emphasis is on tradition and ritual, on maintaining continuity with the past. People are held together by shared beliefs, often religious or mythological, passed down through generations. Elders are venerated, and power typically resides in a theocracy or monarchy. Because the society lacks the means to transfer knowledge and exert influence across a broad territory, it tends to remain small and insular. If it grows, it does so in a decentralized fashion, through the establishment of self-contained settlements that hold the same traditions and beliefs…(More)”
Data sharing restrictions are hampering precision health in the European Union
Paper by Cristina Legido-Quigley et al: “Contemporary healthcare is undergoing a transition, shifting from a population-based approach to personalized medicine on an individual level. In October 2023, the European Partnership for Personalized Medicine was officially launched to communicate the benefits of this approach to citizens and healthcare systems in member countries. The main debate revolves around the inconsistency in regulatory changes within personal data access and its potential commercialization. Moreover, the lack of unified consensus within European Union (EU) countries is leading to problems with data sharing to progress personalized medicine. Here we discuss the integration of biological data with personal information on a European scale for the advancement of personalized medicine, raising legal considerations of data protection under the EU General Data Protection Regulation (GDPR)…(More)”.