Collective Bargaining in the Information Economy Can Address AI-Driven Power Concentration


Position paper by Nicholas Vincent, Matthew Prewitt and Hanlin Li: “…argues that there is an urgent need to restructure markets for the information that goes into AI systems. Specifically, producers of information goods (such as journalists, researchers, and creative professionals) need to be able to collectively bargain with AI product builders in order to receive reasonable terms and a sustainable return on the informational value they contribute. We argue that without increased market coordination or collective bargaining on the side of these primary information producers, AI will exacerbate a large-scale “information market failure” that will lead not only to undesirable concentration of capital, but also to a potential “ecological collapse” in the informational commons. On the other hand, collective bargaining in the information economy can create market frictions and aligned incentives necessary for a pro-social, sustainable AI future. We provide concrete actions that can be taken to support a coalition-based approach to achieve this goal. For example, researchers and developers can establish technical mechanisms such as federated data management tools and explainable data value estimations, to inform and facilitate collective bargaining in the information economy. Additionally, regulatory and policy interventions may be introduced to support trusted data intermediary organizations representing guilds or syndicates of information producers…(More)”.
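The “explainable data value estimations” the authors mention can take many forms; one simple illustration is a leave-one-out estimate, sketched below. This is not the paper’s method — the `toy_utility` function and contributor names are entirely hypothetical stand-ins for a real measurement of an AI product’s quality with and without each producer’s data:

```python
def leave_one_out_values(contributors, utility):
    """Attribute value to each data contributor as the drop in product
    utility when that contributor's data is withheld from training."""
    everyone = frozenset(contributors)
    base = utility(everyone)
    return {c: base - utility(everyone - {c}) for c in contributors}

def toy_utility(group):
    """Hypothetical utility: diminishing returns in the number of sources."""
    return 100 * (1 - 0.5 ** len(group))

values = leave_one_out_values({"journalist", "researcher", "archive"}, toy_utility)
print(values)  # each contributor is credited with the marginal utility they add
```

A data intermediary negotiating on behalf of a guild could, in principle, use such per-contributor estimates as a transparent basis for splitting licensing revenue.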

Human rights centered global governance of quantum technologies: advancing information for all


UNESCO Brief: “The integration of quantum technologies into AI systems introduces greater complexity, requiring stronger policy and technical frameworks that uphold human rights protections. Ensuring that these advancements do not widen existing inequalities or cause environmental harm is crucial.

The Brief expands on the “Quantum technologies and their global impact: discussion paper” published by UNESCO. The objective of this Brief is to unpack the multiple dimensions of the quantum ecosystem and broadly explore the human rights and policy implications of quantum technologies, with some key findings:

  • While quantum technologies promise advancements of human rights in the areas of encryption, privacy, and security, they also pose risks to these very domains and related ones such as freedom of expression and access to information.
  • Quantum innovations will reshape security, economic growth, and science, but without a robust human rights-based framework, they risk deepening inequalities and destabilizing global governance.
  • The quantum divide is emerging as a critical issue, with disparities in access to technology, expertise, and infrastructure widening global inequalities. Unchecked, this gap could limit the benefits of quantum advancements for all.
  • The quantum gender divide remains stark—79% of quantum companies have no female senior leaders, and only 1 in 54 quantum job applicants are women.

The Issue Brief provides broad recommendations and targeted actions for stakeholders, emphasizing human rights-centered governance, awareness, capacity building, and inclusivity to bridge global and gender divides. The key recommendations focus on a comprehensive governance model which must ensure a multistakeholder approach that facilitates state duties, corporate accountability, effective remedies for human rights violations, and open standards for equitable access. Prioritizing human rights in global governance will ensure quantum innovation serves all of humanity while safeguarding fundamental freedoms…(More)”.

In a world first, Brazilians will soon be able to sell their digital data


Article by Gabriel Daros: “Last month, Brazil announced it is rolling out a data ownership pilot that will allow its citizens to manage, own, and profit from their digital footprint — the first such nationwide initiative in the world. 

The project is administered by Dataprev, a state-owned company that provides technological solutions for the government’s social programs. Dataprev is partnering with DrumWave, a California-based data valuation and monetization firm.

Today, “people get nothing from the data they share,” Brittany Kaiser, co-founder of the Own Your Data Foundation and board adviser for DrumWave, told Rest of World. “Brazil has decided its citizens should have ownership rights over their data.”

In monetizing users’ data, Brazil is ahead of the U.S., where a 2019 “data dividend” initiative by California Governor Gavin Newsom never took off. The city of Chicago successfully monetizes government data, including transportation and education data. If implemented, Brazil’s initiative will be the first public-private partnership that allows citizens, rather than companies, to get a share of the global data market, currently valued at $4 billion and expected to grow to over $40 billion by 2034.

The pilot involves a small group of Brazilians who will use data wallets for payroll loans. When users apply for a new loan, the data in the contract will be collected in the data wallets, which companies will be able to bid on. Users will have the option to opt out. It works much like third-party cookies, but instead of simply accepting or declining, people can choose to make money…(More)”.
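The article’s description of the pilot is high-level. As a purely hypothetical sketch of the bid-and-opt-out flow it describes (none of the names below come from Dataprev or DrumWave), the wallet might behave like this:

```python
from dataclasses import dataclass, field

@dataclass
class DataWallet:
    """Hypothetical model of the described flow: loan-contract data lands in
    a citizen's wallet, companies bid on it, and the citizen can opt out."""
    owner: str
    opted_in: bool = True
    records: list = field(default_factory=list)
    bids: dict = field(default_factory=dict)  # bidder -> offer amount

    def add_record(self, record):
        self.records.append(record)

    def place_bid(self, bidder, amount):
        self.bids[bidder] = amount

    def settle(self):
        """Return (winning bidder, amount), or None if the owner opted out
        or nobody bid -- unlike a cookie banner, the citizen is paid."""
        if not self.opted_in or not self.bids:
            return None
        return max(self.bids.items(), key=lambda kv: kv[1])

wallet = DataWallet(owner="citizen-001")
wallet.add_record({"type": "payroll_loan", "amount": 5000})
wallet.place_bid("lender_a", 12.50)
wallet.place_bid("lender_b", 15.00)
print(wallet.settle())  # → ('lender_b', 15.0)
```

The key design point the article highlights survives even in this toy version: consent is a revocable switch, and the default flow routes payment to the data subject rather than to an intermediary.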

Reliable data facilitates better policy implementation


Article by Ganesh Rao and Parul Agarwal: “Across India, state government departments are at the forefront of improving human capabilities through education, health, and nutrition programmes. Their ability to do so effectively depends on administrative (or admin) data collected and maintained by their staff. This data is collected as part of regular operations and informs both day-to-day decision-making and long-term policy. While policymaking can draw on (reasonably reliable) sample surveys alone, effective implementation of schemes and services requires accurate individual-level admin data. However, unreliable admin data can be a severe constraint, forcing bureaucrats to rely on intuition, experience, and informed guesses. Improving the reliability of admin data can greatly enhance state capacity, thereby improving governance and citizen outcomes.

There has been some progress on this front in recent years. For instance, the Jan Dhan-Aadhaar-Mobile (JAM) trinity has significantly improved direct benefit transfer (DBT) mechanisms by ensuring that certain recipient data is reliable. However, challenges remain in accurately capturing the well-being of targeted citizens. Despite significant investments in the digitisation of data collection and management systems, persistent reliability issues undermine the government’s efforts to build a data-driven decision-making culture…

There is growing evidence of serious quality issues in admin data. At CEGIS, we have conducted extensive analyses of admin data across multiple states, uncovering systemic issues in key indicators across sectors and platforms. These quality issues compound over time, undermining both micro-level service delivery and macro-level policy planning. This results in distorted budget allocations, gaps in service provision, and weakened frontline accountability…(More)”.

Some signs of AI model collapse begin to reveal themselves


Article by Steven J. Vaughan-Nichols: “I use AI a lot, but not to write stories. I use AI for search. When it comes to search, AI, especially Perplexity, is simply better than Google.

Ordinary search has gone to the dogs. Maybe as Google goes gaga for AI, its search engine will get better again, but I doubt it. In just the last few months, I’ve noticed that AI-enabled search, too, has been getting crappier.

In particular, I’m finding that when I search for hard data such as market-share statistics or other business numbers, the results often come from bad sources. Instead of stats from 10-Ks, the annual financial reports that the US Securities and Exchange Commission (SEC) requires of public companies, I get numbers from sites purporting to be summaries of business reports. These bear some resemblance to reality, but they’re never quite right. If I specify I want only 10-K results, it works. If I just ask for financial results, the answers get… interesting.

This isn’t just Perplexity. I’ve done the exact same searches on all the major AI search bots, and they all give me “questionable” results.

Welcome to Garbage In/Garbage Out (GIGO). Formally, in AI circles, this is known as AI model collapse. In an AI model collapse, AI systems, which are trained on their own outputs, gradually lose accuracy, diversity, and reliability. This occurs because errors compound across successive model generations, leading to distorted data distributions and “irreversible defects” in performance. The final result? As a 2024 paper in Nature put it, “The model becomes poisoned with its own projection of reality.”

Model collapse is the result of three different factors. The first is error accumulation, in which each model generation inherits and amplifies flaws from previous versions, causing outputs to drift from original data patterns. Next, there is the loss of tail data: In this, rare events are erased from training data, and eventually, entire concepts are blurred. Finally, feedback loops reinforce narrow patterns, creating repetitive text or biased recommendations…(More)”.
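These factors can be illustrated with a toy simulation. The setup below is a deliberate simplification, not a model of any real system: each “generation” fits a Gaussian to the previous generation’s outputs after samples beyond two standard deviations (the rare tail events) are lost, and the fitted distribution narrows generation after generation:

```python
import random
import statistics

def collapse_demo(generations=30, n=1000, seed=0):
    """Toy model-collapse loop: fit a Gaussian to the previous generation's
    own outputs after tail events beyond 2 sigma are dropped, then resample.
    Returns the fitted standard deviation after each generation."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0
    stds = [sigma]
    for _ in range(generations):
        samples = [rng.gauss(mu, sigma) for _ in range(n)]
        # loss of tail data: rare events vanish before "retraining"
        kept = [x for x in samples if abs(x - mu) <= 2 * sigma]
        # error accumulation: the next model is fitted only to surviving outputs
        mu = statistics.fmean(kept)
        sigma = statistics.stdev(kept)
        stds.append(sigma)
    return stds

stds = collapse_demo()
print(f"std at generation 0:  {stds[0]:.3f}")
print(f"std at generation 30: {stds[-1]:.3f}")  # far narrower than it began
```

Even in this crude sketch the feedback loop is visible: nothing ever widens the distribution, so each generation can only reinforce an ever-narrower slice of the original data.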

Project Push creates an archive of news alerts from around the world


Article by Neel Dhanesha: “A little over a year ago, Matt Taylor began to feel like he was getting a few too many push notifications from the BBC News app.

It’s a feeling many of us can probably relate to. Many people, myself included, have turned off news notifications entirely in the past few months. Taylor, however, went in the opposite direction.

Instead of turning off notifications, he decided to see how the BBC — the most popular news app in the U.K., where Taylor lives — compared to other news organizations around the world. So he dug out an old Google Pixel phone, downloaded 61 news apps onto it, and signed up for push notifications on all of them.

As notifications roll in, a custom-built script (made with the help of ChatGPT) uploads their text to a server and a Bluesky page, providing a near real-time view of push notifications from services around the world. Taylor calls it Project Push.
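Taylor’s actual script isn’t public. A minimal sketch of just the archiving step, assuming notifications arrive as app/title/body text and using a hypothetical local JSON Lines file in place of his server and Bluesky posting, might look like:

```python
import json
import time
from pathlib import Path

ARCHIVE = Path("push_archive.jsonl")  # hypothetical local archive file

def log_notification(app, title, body):
    """Append one push notification to a JSON Lines archive, timestamped
    on receipt, so the feed can be replayed or analyzed later."""
    record = {
        "received_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "app": app,
        "title": title,
        "body": body,
    }
    with ARCHIVE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
    return record

rec = log_notification("BBC News", "Breaking", "Example alert text")
print(rec["app"])  # prints "BBC News"
```

An append-only line-per-record format like this is a natural fit for the use case: each alert is independent, arrival order is preserved, and a year of notifications (the ~160,000 mentioned below) stays trivially greppable.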

People who work in news “take the front page very seriously,” said Taylor, a product manager at the Financial Times who built Project Push in his spare time. “There are lots of editors who care a lot about that, but actually one of the most important people in the newsroom is the person who decides that they’re going to press a button that sends an immediate notification to millions of people’s phones.”

The Project Push feed is a fascinating portrait of the news today. There are the expected alerts — breaking news, updates to ongoing stories like the wars in Gaza and Ukraine, the latest shenanigans in Washington — but also:

— Updates on infrastructure plans that, without the context, become absolutely baffling (a train will instead be a bus?).

— Naked attempts to increase engagement.

— Culture updates that some may argue aren’t deserving of a push alert from the Associated Press.

— Whatever this is.

Taylor tells me he’s noticed some geographic differences in how news outlets approach push notifications. Publishers based in Asia and the Middle East, for example, send far more notifications than European or American ones; CNN Indonesia alone pushed about 17,000 of the 160,000 or so notifications Project Push has logged over the past year…(More)”.

Trump Taps Palantir to Compile Data on Americans


Article by Sheera Frenkel and Aaron Krolik: “In March, President Trump signed an executive order calling for the federal government to share data across agencies, raising questions over whether he might compile a master list of personal information on Americans that could give him untold surveillance power.

Mr. Trump has not publicly talked about the effort since. But behind the scenes, officials have quietly put technological building blocks into place to enable his plan. In particular, they have turned to one company: Palantir, the data analysis and technology firm.

The Trump administration has expanded Palantir’s work across the federal government in recent months. The company has received more than $113 million in federal government spending since Mr. Trump took office, according to public records, including additional funds from existing contracts as well as new contracts with the Department of Homeland Security and the Pentagon. (This does not include a $795 million contract that the Department of Defense awarded the company last week, which has not been spent.)

Representatives of Palantir are also speaking to at least two other agencies — the Social Security Administration and the Internal Revenue Service — about buying its technology, according to six government officials and Palantir employees with knowledge of the discussions.

The push has put a key Palantir product called Foundry into at least four federal agencies, including D.H.S. and the Health and Human Services Department. Widely adopting Foundry, which organizes and analyzes data, paves the way for Mr. Trump to easily merge information from different agencies, the government officials said…

Creating detailed portraits of Americans based on government data is not just a pipe dream. The Trump administration has already sought access to hundreds of data points on citizens and others through government databases, including their bank account numbers, the amount of their student debt, their medical claims and any disability status…(More)”.

Digital Democracy in a Divided Global Landscape


10 essays by the Carnegie Endowment for International Peace: “A first set of essays analyzes how local actors are navigating the new tech landscape. Lillian Nalwoga explores the challenges and upsides of Starlink satellite internet deployment in Africa, highlighting legal hurdles, security risks, and concerns about the platform’s leadership. As African nations look to Starlink as a valuable tool in closing the digital divide, Nalwoga emphasizes the need to invest in strong regulatory frameworks to safeguard digital spaces. Jonathan Corpus Ong and Dean Jackson analyze the landscape of counter-disinformation funding in local contexts. They argue that there is a “mismatch” between the priorities of funders and the strategies that activists would like to pursue, resulting in “ineffective and extractive workflows.” Ong and Jackson isolate several avenues for structural change, including developing “big tent” coalitions of activists and strategies for localizing aid projects. Janjira Sombatpoonsiri examines the role of local actors in foreign influence operations in Southeast Asia. She highlights three motivating factors that drive local participation in these operations: financial benefits, the potential to gain an edge in domestic power struggles, and the appeal of anti-Western narratives.

A second set of essays explores evolving applications of digital repression…

A third set focuses on national strategies and digital sovereignty debates…

A fourth set explores pressing tech policy and regulatory questions…(More)”.

Representants and International Orders


Book by Alena Drieschova: “Different units of international politics, such as states or the church, cannot be present in their entirety during international interactions. Political rule needs to be represented for international actors to coordinate their activities. Representants (i.e. maps, GDP, buildings, and diplomatic and warfare practices) establish collective understandings about the nature of authority and its configuration. Whilst representants are not exact replicas, they highlight and omit certain features from the units they stand in for. In these inclusions and exclusions lies representants’ irreducible effect. This book studies how representants define the units of the international system and position them in relation to each other, thereby generating an international order. When existing representants change, the international order changes because the units are defined differently and stand in different relations to each other. Power is therefore defined differently. Spanning centuries of European history, Alena Drieschova traces the struggles between actors over these representations…(More)”.

Upgrading Democracies with Fairer Voting Methods


Paper by Evangelos Pournaras, et al: “Voting methods are instrumental design element of democracies. Citizens use them to express and aggregate their preferences to reach a collective decision. However, voting outcomes can be as sensitive to voting rules as they are to people’s voting choices. Despite the significance and inter-disciplinary scientific progress on voting methods, several democracies keep relying on outdated voting methods that do not fit modern, pluralistic societies well, while lacking social innovation. Here, we demonstrate how one can upgrade real-world democracies, namely by using alternative preferential voting methods such as cumulative voting and the method of equal shares designed for a proportional representation of voters’ preferences. By rigorously assessing a new participatory budgeting approach applied in the city of Aarau, Switzerland, we unravel the striking voting outcomes of fair voting methods: more winning projects with the same budget and broader geographic and preference representation of citizens by the elected projects, in particular for voters who used to be under-represented, while promoting novel project ideas. We provide profound causal evidence showing that citizens prefer proportional voting methods, which possess strong legitimacy without the need of very technical specialized explanations. We also reveal strong underlying democratic values exhibited by citizens who support fair voting methods such as altruism and compromise. These findings come with a global momentum to unleash a new and long-awaited participation blueprint of how to upgrade democracies…(More)”.
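The method of equal shares mentioned in the abstract has a precise definition: every voter receives an equal share of the budget, and in each round the project funded is the one its supporters can pay for with the smallest maximal per-voter payment ρ (each supporter pays min(ρ, their remaining share)). A compact sketch for approval ballots follows; the project and voter names are invented for illustration:

```python
def rho_for(cost, budgets):
    """Smallest per-voter payment rho such that supporters paying
    min(rho, remaining budget) jointly cover the cost; None if they can't."""
    bs = sorted(budgets)
    k = len(bs)
    prefix = 0.0
    for j, b in enumerate(bs):
        rho = (cost - prefix) / (k - j)  # voters 0..j-1 pay all they have left
        if rho <= b + 1e-9:
            return rho
        prefix += b
    return None

def equal_shares(total_budget, costs, approvals):
    """Method of equal shares with approval ballots.
    costs: project -> cost; approvals: project -> set of voter ids."""
    voters = set().union(*approvals.values())
    budget = {v: total_budget / len(voters) for v in voters}  # equal shares
    winners, remaining = [], sorted(costs)  # sorted for a deterministic tie-break
    while True:
        best, best_rho = None, None
        for p in remaining:
            supporters = [budget[v] for v in approvals[p] if budget[v] > 1e-9]
            if not supporters:
                continue
            rho = rho_for(costs[p], supporters)
            if rho is not None and (best_rho is None or rho < best_rho):
                best, best_rho = p, rho
        if best is None:
            return winners
        winners.append(best)
        remaining.remove(best)
        for v in approvals[best]:
            budget[v] = max(0.0, budget[v] - best_rho)

costs = {"A": 40, "B": 40, "C": 20}
approvals = {"A": {1, 2}, "B": {3, 4}, "C": {1, 2, 3, 4}}
print(equal_shares(100, costs, approvals))  # → ['C', 'A', 'B']
```

The proportionality the paper reports falls out of this structure: no coalition can be charged more than its equal share of the budget, so broadly supported projects win without letting any majority spend minority voters’ shares.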