Protests


Paper by Davide Cantoni, Andrew Kao, David Y. Yang & Noam Yuchtman: “Citizens have long taken to the streets to demand change, expressing political views that may otherwise be suppressed. Protests have produced change at local, national, and international scales, including spectacular moments of political and social transformation. We document five new empirical patterns describing 1.2 million protest events across 218 countries between 1980 and 2020. First, autocracies and weak democracies experienced a trend break in protests during the Arab Spring. Second, protest movements also rose in importance following the Arab Spring. Third, protest movements geographically diffuse over time, spiking to their peak, before falling off. Fourth, a country’s year-to-year economic performance is not strongly correlated with protests; individual values are predictive of protest participation. Fifth, the US, China, and Russia are the most over-represented countries by their share of academic studies. We discuss each pattern’s connections to the existing literature and anticipate paths for future work…(More)”.

Designing Research For Impact


Blog by Duncan Green: “The vast majority of proposals seem to conflate impact with research dissemination (a heroic leap of faith – changing the world one seminar at a time), or to outsource impact to partners such as NGOs and thinktanks.

Of the two, the latter looks more promising, but then the funder should ask to see both evidence of genuine buy-in from the partners, and appropriate budget for the work. Bringing in a couple of NGOs as ‘bid candy’ with little money attached is unlikely to produce much impact.

There is plenty written on how to genuinely design research for impact, e.g. this chapter from a number of Oxfam colleagues on its experience, or How to Engage Policy Makers with your Research (an excellent book I reviewed recently and on the LSE Review of Books). In brief, proposals should:

  • Identify the kind(s) of impacts being sought: policy change, attitudinal shifts (public or among decision makers), implementation of existing laws and policies etc.
  • Provide a stakeholder mapping of the positions of key players around those impacts – supporters, waverers and opponents.
  • Explain how the research plans to target some/all of these different individuals/groups, including during the research process itself (not just ‘who do we send the papers to once they’re published?’).
  • Identify which messengers/intermediaries will be recruited to convey the research to the relevant targets (researchers themselves are not always best placed to persuade them).
  • Flag potential ‘critical junctures’, such as crises or changes of political leadership, that could open windows of opportunity for uptake, and explain how the research team is set up to spot and respond to them.
  • Anticipate attacks/backlash against research on sensitive issues and set out how the researchers plan to respond.
  • Include plans for review and adaptation of the influencing strategy.

I am not arguing for proposals to indicate specific impact outcomes – most systems are way too complex for that. But, an intentional plan based on asking questions on the points above would probably help researchers improve their chances of impact.

Based on the conversations I’ve been having, I also have some thoughts on what is blocking progress.

Impact is still too often seen as an annoying hoop to jump through at the funding stage (and then largely forgotten, at least until reporting at the end of the project). The incentives are largely personal/moral (‘I want to make a difference’), whereas the weight of professional incentives lies in accumulating academic publications and earning the approval of peers (hence the focus on seminars).

The timeline of advocacy, with its focus on ‘dancing with the system’, jumping on unexpected windows of opportunity etc, does not mesh with the relentless but slow pressure to write and publish. An academic is likely to pay a price if they drop their current research plans to rehash prior work to take advantage of a brief policy ‘window of opportunity’.

There is still some residual snobbery, at least in some disciplines. You still hear terms like ‘media don’, which is not meant as a compliment. For instance, my friend Ha-Joon Chang is now an economics professor at SOAS, but what on earth was Cambridge University thinking in not making a global public intellectual and brilliant mind a professor while he was there?

True, there is also some more justified concern that designing research for impact can damage the research’s objectivity/credibility – hence the desire to pull in NGOs and thinktanks as intermediaries. But, this conversation still feels messy and unresolved, at least in the UK…(More)”.

Advancing Environmental Justice with AI


Article by Justina Nixon-Saintil: “Given its capacity to innovate climate solutions, the technology sector could provide the tools we need to understand, mitigate, and even reverse the damaging effects of global warming. In fact, addressing longstanding environmental injustices requires these companies to put the newest and most effective technologies into the hands of those on the front lines of the climate crisis.

Tools that harness the power of artificial intelligence, in particular, could offer unprecedented access to accurate information and prediction, enabling communities to learn from and adapt to climate challenges in real time. The IBM Sustainability Accelerator, which we launched in 2022, is at the forefront of this effort, supporting the development and scaling of projects such as the Deltares Aquality App, an AI-powered tool that helps farmers assess and improve water quality. As a result, farmers can grow crops more sustainably, prevent runoff pollution, and protect biodiversity.

Consider also the challenges that smallholder farmers face, such as rising costs, the difficulty of competing with larger producers that have better tools and technology, and, of course, the devastating effects of climate change on biodiversity and weather patterns. Accurate information, especially about soil conditions and water availability, can help them address these issues, but has historically been hard to obtain…(More)”.

Experts: 90% of Online Content Will Be AI-Generated by 2026


Article by Maggie Harrison: “Don’t believe everything you see on the Internet” has been pretty standard advice for quite some time now. And according to a new report from European law enforcement group Europol, we have all the reason in the world to step up that vigilance.

“Experts estimate that as much as 90 percent of online content may be synthetically generated by 2026,” the report warned, adding that synthetic media “refers to media generated or manipulated using artificial intelligence.”

“In most cases, synthetic media is generated for gaming, to improve services or to improve the quality of life,” the report continued, “but the increase in synthetic media and improved technology has given rise to disinformation possibilities.”…

The report focused pretty heavily on disinformation, notably that driven by deepfake technology. But that 90 percent figure raises other questions, too — what do AI systems like Dall-E and GPT-3 mean for artists, writers, and other content-generating creators? And circling back to disinformation once more, what will the dissemination of information, not to mention the consumption of it, actually look like in an era driven by that degree of AI-generated digital stuff?…(More)”.

Unlocking the value of supply chain data across industries


MIT Technology Review Insights: “The product shortages and supply-chain delays of the global covid-19 pandemic are still fresh memories. Consumers and industry are concerned that the next geopolitical or climate event may have a similar impact. Against a backdrop of evolving regulations, these conditions mean manufacturers want to be prepared for short supplies, concerned customers, and weakened margins.

For supply chain professionals, achieving a “phygital” information flow—the blending of physical and digital data—is key to unlocking resilience and efficiency. As physical objects travel through supply chains, they generate a rich flow of data about the item and its journey—from its raw materials, its manufacturing conditions, even its expiration date—bringing new visibility and pinpointing bottlenecks.

This phygital information flow offers significant advantages, from enhancing the ability to create rich customer experiences to satisfying environmental, social, and corporate governance (ESG) goals. In a 2022 EY global survey of executives, 70% of respondents agreed that a sustainable supply chain will increase their company’s revenue.

For disparate parties to exchange product information effectively, they require a common framework and universally understood language. Among supply chain players, data standards create a shared foundation. Standards help uniquely identify, accurately capture, and automatically share critical information about products, locations, and assets across trading communities…(More)”.
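The “phygital” flow described in the excerpt is easiest to see as a concrete record. Below is a minimal sketch; every field name and value is hypothetical, loosely inspired by the kind of standardized event data that frameworks such as GS1’s EPCIS define, not an actual standard schema:

```python
# Hypothetical record of the data a single product accumulates on its
# journey: raw materials, manufacturing conditions, and movement events.
item_journey = {
    "item_id": "urn:example:sku:4711",
    "raw_materials": ["organic cotton", "recycled polyester"],
    "manufacturing": {"site": "Plant A", "date": "2023-02-14", "co2_kg": 1.8},
    "events": [
        {"step": "shipped",  "location": "Rotterdam", "timestamp": "2023-03-01T08:00Z"},
        {"step": "received", "location": "Hamburg",   "timestamp": "2023-03-03T17:30Z"},
    ],
    "expiration_date": None,  # not applicable for apparel
}

# Because the fields follow a shared standard, any trading partner's
# system can walk the same structure, e.g. to trace consecutive legs
# of the journey and spot bottlenecks.
for prev, nxt in zip(item_journey["events"], item_journey["events"][1:]):
    print(f'{prev["step"]} -> {nxt["step"]}: {prev["location"]} to {nxt["location"]}')
```

The point of the standards paragraph above is precisely that both sides of such an exchange agree on the field names and identifiers in advance, so no human mediation is needed.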

Digital Empires: The Global Battle to Regulate Technology


Book by Anu Bradford: “The global battle among the three dominant digital powers—the United States, China, and the European Union—is intensifying. All three regimes are racing to regulate tech companies, with each advancing a competing vision for the digital economy while attempting to expand its sphere of influence in the digital world. In Digital Empires, her provocative follow-up to The Brussels Effect, Anu Bradford explores a rivalry that will shape the world in the decades to come.

Across the globe, people dependent on digital technologies have become increasingly alarmed that their rapid adoption and transformation have ushered in an exceedingly concentrated economy where a few powerful companies control vast economic wealth and political power, undermine data privacy, and widen the gap between economic winners and losers. In response, world leaders are variously embracing the idea of reining in the most dominant tech companies. Bradford examines three competing regulatory approaches—the American market-driven model, the Chinese state-driven model, and the European rights-driven regulatory model—and discusses how governments and tech companies navigate the inevitable conflicts that arise when these regulatory approaches collide in the international domain. Which digital empire will prevail in the contest for global influence remains an open question, yet their contrasting strategies are increasingly clear.

Digital societies are at an inflection point. In the midst of these unfolding regulatory battles, governments, tech companies, and digital citizens are making important choices that will shape the future ethos of the digital society. Digital Empires lays bare the choices we face as societies and individuals, explains the forces that shape those choices, and illuminates the immense stakes involved for everyone who uses digital technologies…(More)”.

AI and new standards promise to make scientific data more useful by making it reusable and accessible


Article by Bradley Wade Bishop: “…AI makes it highly desirable for any data to be machine-actionable – that is, usable by machines without human intervention. Now, scholars can consider machines not only as tools but also as potential autonomous data reusers and collaborators.

The key to machine-actionable data is metadata. Metadata are the descriptions scientists set for their data and may include elements such as creator, date, coverage and subject. Minimal metadata is minimally useful, but correct and complete standardized metadata makes data more useful for both people and machines.

It takes a cadre of research data managers and librarians to make machine-actionable data a reality. These information professionals work to facilitate communication between scientists and systems by ensuring the quality, completeness and consistency of shared data.

The FAIR data principles, created by a group of researchers called FORCE11 in 2016 and used across the world, provide guidance on how to enable data reuse by machines and humans. FAIR data is findable, accessible, interoperable and reusable – meaning it has robust and complete metadata.
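The idea that complete, standardized metadata makes data machine-actionable can be sketched in a few lines. The field names below are illustrative (loosely modeled on Dublin Core–style elements), and the completeness check is a toy stand-in for the real assessment tools data managers use, not a FAIR implementation:

```python
# Toy completeness check: a machine can judge whether a dataset's
# metadata is usable without any human intervention.
REQUIRED_FIELDS = {"creator", "date", "coverage", "subject", "license", "identifier"}

def fair_completeness(metadata: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    present = {k for k, v in metadata.items() if k in REQUIRED_FIELDS and v}
    return len(present) / len(REQUIRED_FIELDS)

record = {
    "creator": "Ocean Survey Team",
    "date": "2016-03-01",
    "coverage": "North Atlantic",
    "subject": "sea surface temperature",
    "license": "",  # empty value: this record is minimally useful here
    "identifier": "doi:10.0000/example",
}

print(f"Completeness: {fair_completeness(record):.0%}")
```

A program (or an AI data reuser) running such a check can filter out datasets whose metadata is too sparse to act on, which is exactly the gap research data managers and librarians work to close.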

In the past, I’ve studied how scientists discover and reuse data. I found that scientists tend to use mental shortcuts when they’re looking for data – for example, they may go back to familiar and trusted sources or search for certain key terms they’ve used before. Ideally, my team could build this expert decision-making into AI while removing as many biases as possible. The automation of these mental shortcuts should reduce the time-consuming chore of locating the right data…(More)”.

How to improve economic forecasting


Article by Nicholas Gruen: “Today’s four-day weather forecasts are as accurate as one-day forecasts were 30 years ago. Economic forecasts, on the other hand, aren’t noticeably better. Former Federal Reserve chair Ben Bernanke should ponder this in his forthcoming review of the Bank of England’s forecasting.

There’s growing evidence that we can improve. But myopia and complacency get in the way. Myopia is an issue because economists think technical expertise is the essence of good forecasting when, actually, two things matter more: forecasters’ understanding of the limits of their expertise and their judgment in handling those limits.

Enter Philip Tetlock, whose 2005 book on geopolitical forecasting showed how little experts added to forecasting done by informed non-experts. To compare forecasts between the two groups, he forced participants to drop their vague weasel words — “probably”, “can’t be ruled out” — and specify exactly what they were forecasting and with what probability. 

That started sorting the sheep from the goats. The simple “point forecasts” provided by economists — such as “growth will be 3.0 per cent” — are doubly unhelpful in this regard. They’re silent about what success looks like. If I have forecast 3.0 per cent growth and actual growth comes in at 3.2 per cent — did I succeed or fail? Such predictions also don’t tell us how confident the forecaster is.

By contrast, “a 70 per cent chance of rain” specifies a clear event with a precise estimation of the weather forecaster’s confidence. Having rigorously specified the rules of the game, Tetlock has since shown how what he calls “superforecasting” is possible and how diverse teams of superforecasters do even better. 
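Tetlock’s forecasting tournaments score such probabilistic predictions with the Brier score: the mean squared difference between the stated probability and what actually happened (lower is better; always saying 50% earns 0.25). A small sketch, with forecast numbers invented purely for illustration:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and 0/1 outcomes.

    0.0 is perfect; 0.25 is what hedging at 50% every time earns.
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# "A 70 per cent chance of rain" becomes scorable once we observe
# whether it rained. Illustrative forecasts and outcomes:
probs    = [0.7, 0.7, 0.7, 0.9, 0.2]  # forecaster's stated confidence
happened = [1,   1,   0,   1,   0]    # 1 = event occurred, 0 = it did not

print(round(brier_score(probs, happened), 3))
```

This is what “rigorously specified rules of the game” buys: unlike a bare point forecast, a probability plus an observed outcome yields a number, so forecasters can be ranked and can learn from their calibration over time.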

What qualities does Tetlock see in superforecasters? As well as mastering necessary formal techniques, they’re open-minded, careful, curious and self-critical — in other words, they’re not complacent. Aware, like Socrates, of how little they know, they’re constantly seeking to learn — from unfolding events and from colleagues…(More)”.

Informing the Global Data Future: Benchmarking Data Governance Frameworks


Paper by Sara Marcucci, Natalia González Alarcón, Stefaan G. Verhulst and Elena Wüllhorst: “Data has become a critical trans-national and cross-border resource. Yet, the lack of a well-defined approach to using it poses challenges to harnessing its value. This article explores the increasing importance of global data governance due to the rapid growth of data, and the need for responsible data practices. The purpose of this paper is to compare approaches and identify patterns in the emergent data governance ecosystem within sectors close to the international development field, ultimately presenting key takeaways and reflections on when and why a global data governance framework may be needed. Overall, the paper provides information about the conditions when a more holistic, coordinated transnational approach to data governance may be needed to responsibly manage the global flow of data. The report does this by (a) considering conditions specified by the literature that may be conducive to global data governance, and (b) analyzing and comparing existing frameworks, specifically investigating six key elements: purpose, principles, anchoring documents, data description and lifecycle, processes, and practices. The article closes with a series of final recommendations, which include adopting a broader concept of data stewardship to reconcile data protection and promotion, focusing on responsible reuse of data to unlock socioeconomic value, harmonizing meanings to operationalize principles, incorporating global human rights frameworks to provide common North Stars, unifying key definitions of data, adopting a data lifecycle approach, incorporating participatory processes and collective agency, investing in new professions with specific roles, improving accountability through oversight and compliance mechanisms, and translating recommendations into practical tools…(More)”

It’s like jury duty, but for getting things done


Article by Hollie Russon Gilman and Amy Eisenstein: “Citizens’ assemblies have the potential to repair our broken politics…Imagine a democracy where people come together and their voices are heard and are translated directly into policy. Frontline workers, doctors, teachers, friends, and neighbors — young and old — are brought together in a random, representative sample to deliberate the most pressing issues facing our society. And they are compensated for their time.

The concept may sound radical. But we already use this method for jury duty. Why not try this widely accepted practice to tackle the deepest, most crucial, and most divisive issues facing our democracy?

The idea — known today as citizens’ assemblies — originated in ancient Athens. Instead of a top-down government, Athens used sortition — a system that was horizontal and distributive. The kleroterion, an allotment machine, randomly selected citizens to hold civic office, ensuring that the people had a direct say in their government’s dealings….(More)”.