Data for Peace and Humanitarian Response? The Case of the Ukraine-Russia War


Article by Behruz Davletov, Uma Kalkar, Salwa Mansuri, Marine Ragnet, and Stefaan Verhulst at Data & Policy: “Since the outbreak of hostilities between Russia and Ukraine on 24 February 2022, more than 4,889 (28,081 according to the Ukrainian government) civilians have been killed and over 7 million people have been displaced. The conflict has had a significant impact on civilians, particularly women and children. In response to the crisis, local and international organizations have sought to provide immediate humanitarian assistance and have launched numerous initiatives to monitor violations and work toward peacebuilding and conflict resolution.

As in other areas of society, data and data science have become important to tailor, conduct, and monitor emergency responses in conflict zones. Data has also become crucial to support humanitarian action and peacebuilding. For example, data collected from satellite, GPS, and drone technologies can be used to map a conflict’s evolution, understand the needs of civilians, evaluate migration patterns, analyze discourses coming from both sides, and track the delivery of assistance.

This article focuses on the role that data has played in crisis response and peacebuilding related to the Russian-Ukrainian war so as to demonstrate how data can be used for peace. We consider a variety of publicly available evidence to examine how data is playing a role in the ongoing conflict, mainly from a humanitarian response perspective. In particular, we consider the following taxonomy of data usage:

  • Prediction: Data is used to monitor and plan for likely events and risks both prior to and during the conflict;
  • Narratives: Data plays a critical role in both constructing and countering misinformation and disinformation;
  • Infrastructure Damage: Data can be used to track and respond to infrastructure damage, as well as to associated human rights violations and migration flows;
  • Human Rights Violations and Abuses: Data is used to identify and report human rights abuses, and to help construct a legal basis for justice;
  • Migration Flows: Large-scale population flows, both within Ukraine and toward neighboring countries, are one of the defining features of the conflict. Data is being used to monitor these flows, and to target humanitarian assistance;
  • Humanitarian Response: In addition to the above, data is also being used for a wide variety of humanitarian purposes, including ensuring access to basic and medical supplies, and addressing the resulting mental health crisis….(More)”.

Transforming public policy with engaged scholarship: better together


Blog by Alana Cattapan & Tobin LeBlanc Haley: “The expertise of people with lived experience is receiving increased attention within policy making arenas. Yet consultation processes have, for the most part, been led by public servants, with limited resources provided for supporting the community engagement vital to the inclusion of lived experience experts in policy making. What would policy decisions look like if the voices of the communities who live with the consequences of these decisions were prioritised not only in consultation processes, but in determining priorities and policy processes from the outset? This is one of the questions we explore in our recent article published in the special issue on Transformational Change in Public Policy.

As community-engaged policy researchers, along with Leah Levac, Laura Pin, Ethel Tungohan and Sarah Marie Wiebe, our attention has been focused on how to engage meaningfully and work together with the communities impacted by our research, the very communities often systematically excluded from policy processes. Across our different research programmes, we work together with people experiencing precarious housing and homelessness, migrant workers, northern and Indigenous women, First Nations, and trans and gender diverse people. The lessons we have learned in our research with these communities are useful for our work and for these communities, as well as for policy makers and other actors wanting to engage meaningfully with community stakeholders.

Our new article, “Transforming Public Policy with Engaged Scholarship: Better Together,” describes these lessons, showing how engaged scholarship can inform the meaningful inclusion of people with lived expertise in public policy making. We draw on Marianne Beaulieu, Mylaine Breton and Astrid Brouselle’s work to focus on four principles of engaged scholarship: prioritising community needs, practicing reciprocity, recognising multiple ways of knowing, and crossing disciplinary and sectoral boundaries. Using five vignettes from our own research, we link these principles to our practice, highlighting how policy makers can do the same. In one vignette, co-author Sarah Marie Wiebe describes how her research with people in Aamjiwnaang in Canada was made possible through the sustained time and effort of relationship building and learning about the lived experiences of community members. As she explains in the article, this work included sensing the pollution in the surrounding atmosphere firsthand through participation in a “toxic tour” of the community’s location next to Canada’s Chemical Valley. In another vignette, co-author Ethel Tungohan details how migrant community leaders led a study looking at migrant workers’ housing precarity, enabling more responsive forms of engagement with municipal policy makers who tend to ignore migrant workers’ housing issues….(More)”.

The Decentralized Web: Hope or Hype?


Article by Inga Trauthig: “The heavy financial losses of cryptocurrency holders in recent months have catapulted a relatively niche tech topic into public view. However, many investors originally did not emphasize economic gains as their primary motivation for supporting cryptocurrencies. A different motive was driving them: decentralization.

Cryptocurrencies, together with blockchain, belong to a broader field related to the decentralized Web (DWeb) or Web3, a field that remains somewhat obscure. In August 2022, many informed readers are likely to be able to explain bitcoin, but fewer will be able to explain the differences between various DWeb services, or how content moderation on a new version of the internet works — or could work in future.

The DWeb is currently a movement, parts of which are heavily tied to blockchain as a revolutionary technology purported to resolve the current ills of the internet. But some in the movement disagree on the dogma of blockchain (together with incentive stimulus and game theory) as the Web’s saviour — while concurring on the basic tenet that the current internet space, Web 2.0, has been corrupted by centralization. In other words, the DWeb is a movement whose members share many ideals but differ in their approaches to achieving them. And some parts of this movement have much broader reach than others. While bitcoin has swept the globe and managed to draw adherents in the Global North and South, social media DWeb services are still mostly used by the technological cognoscenti.

In effect, at the current stage, successes of a decentralized Web are few and far between. They relate to two main aspirations: first, the empirical (re-)decentralization of the internet, and second, an appeal to make the internet a good place (again). The latter is certainly tempting given that Web 2.0 is regularly accused of enabling authoritarian movements and actors, or online radicalization…(More)”.

Measuring human rights: facing a necessary challenge


Essay by Eduardo Burkle: “Given the abundance of data available today, many assume the world already has enough accurate metrics on human rights performance. However, the political sensitivity of human rights has proven a significant barrier to access. Governments often avoid producing and sharing this type of information.

States’ compliance with their human rights obligations often receives a lot of attention. But there is still much discussion about how to measure it. At the same time, statistics and data increasingly drive political and bureaucratic decisions. This, in turn, brings some urgency to the task of ensuring the best possible data are available.

Establishing cross-national human rights measures is vital for research, advocacy, and policymaking. It can also have a direct effect on people’s enjoyment of human rights. Good data allow states and other actors to evaluate how well their country is performing and to make comparisons that highlight which policies and institutions are truly effective in promoting human rights.

Such context makes it crucial to arm researchers, journalists, advocates, practitioners, investors, and companies with reliable information when raising human rights issues in their countries, and around the world…(More)”.

The fear of technology-driven unemployment and its empirical base


Article by Kerstin Hötte, Melline Somers and Angelos Theodorakopoulos: “New technologies may replace human labour, but can simultaneously create jobs if workers are needed to use these technologies or if new economic activities emerge. At the same time, technology-driven productivity growth may increase disposable income, stimulating a demand-induced employment expansion. Based on a systematic review of the empirical literature on technological change and its impact on employment published in the past four decades, this column suggests that the empirical support for the labour-creating effects of technological change dominates that for labour replacement…(More)”.

Inside India’s plan to train 3.1 million 21st century civil servants


Article by Anirudh Dinesh and Beth Simone Noveck: “Prime Minister Modi established the Government of India’s Capacity Building Commission (CBC) on April 1, 2021 to reimagine how the state can deliver high-quality citizen services. According to the Commission’s chairman, Adil Zainulbhai, and its secretary, Hemang Jani, the Commission will work with 93 central government departments and more than 800 training institutions across India to train over three million central government employees.

By training employees, especially those who interact with citizens on a daily basis, such as those in the railways and postal departments, the Commission hopes to impart new ways of working that translate into more effective and trustworthy government and better quality interactions with residents. The Commission has set itself two “north stars” or stretch goals to accomplish, namely to contribute to improving the “ease of living” for citizens and to advance Prime Minister Modi’s vision to make India a $5 trillion economy…

The Capacity Building Commission’s philosophy is that the competencies that civil servants are trained in should not be defined from the top down. Rather, the Commission wants each ministry to answer the question: what is the single most important thing we need to accomplish? Each ministry then defines the competencies it needs to achieve that goal. …

An important first step in creating a capacity building programme is to understand what competencies already exist (or not) in the civil service. We asked both Zainulbhai and Jani about the CBC’s thinking about creating such a baseline of skills. The Commission’s approach, Jani explained to us, is to ask each ministry to look at its training needs from three “lenses:”

  1. Does the ministry have the capacity to deliver on “national priorities”? And are government employees aware of these national priorities?
  2. Does the ministry have the capacity necessary to deliver “citizen-centric” services?
  3. The “technology lens”: Do civil servants not only understand the challenges posed by technology but also appreciate new technologies and the solutions that could come from them?

The Commission also looks at capacity building on three levels:

  1. The individual level: What knowledge, skill and attitude an individual needs.
  2. The organisation level: What rules and procedures might be hindering service delivery.
  3. The institutional level: How to create an enabling environment for employees to upskill themselves resulting in better public services…(More)”

We don’t have a hundred biases, we have the wrong model


Blog by Jason Collins: “…Behavioral economics today is famous for its increasingly large collection of deviations from rationality, or, as they are often called, ‘biases’. While this collection is useful in applied work, it is time to shift our focus away from cataloguing deviations from a model of rationality that we know is not true. Rather, we need to develop new theories of human decision-making to progress behavioral economics as a science. We need heliocentrism.

The dominant model of human decision-making across many disciplines, including my own, economics, is the rational-actor model. People make decisions based on their preferences and the constraints that they face. Whether implicitly or explicitly, they typically have the computational power to calculate the best decision and the willpower to carry it out. It’s a fiction but a useful one.

As has become broadly known through the growth of behavioral economics, there are many deviations from this model. (I am going to use the term behavioral economics through this article as a shorthand for the field that undoubtedly extends beyond economics to social psychology, behavioral science, and more.) This list of deviations has grown to the extent that if you visit the Wikipedia page ‘List of Cognitive Biases’ you will now see in excess of 200 biases and ‘effects’. These range from the classics described in the seminal papers of Amos Tversky and Daniel Kahneman through to the obscure.

We are still at the collection-of-deviations stage. There are not 200 human biases. There are 200 deviations from the wrong model…(More)”

Who Is Falling for Fake News?


Article by Angie Basiouny: “People who read fake news online aren’t doomed to fall into a deep echo chamber where the only sound they hear is their own ideology, according to a revealing new study from Wharton.

Surprisingly, readers who regularly browse fake news stories served up by social media algorithms are more likely to diversify their news diet by seeking out mainstream sources. These well-rounded news junkies make up more than 97% of online readers, compared with the scant 2.8% who consume online fake news exclusively.

“We find that these echo chambers that people worry about are very shallow. This idea that the internet is creating an echo chamber is just not holding out to be true,” said Senthil Veeraraghavan, a Wharton professor of operations, information and decisions.

Veeraraghavan is co-author of the paper, “Does Fake News Create Echo Chambers?” It was also written by Ken Moon, Wharton professor of operations, information and decisions, and Jiding Zhang, an assistant operations management professor at New York University Shanghai who earned her doctorate at Wharton.

The study, which examined the browsing activity of nearly 31,000 households during 2017, offers empirical evidence that goes against popular beliefs about echo chambers. While echo chambers certainly are dark and dangerous places, they aren’t metaphorical black holes that suck in every person who reads an article about, say, Obama birtherism theory or conspiracies about COVID-19 vaccines. The study found that households exposed to fake news actually increase their exposure to mainstream news by 9.1%.
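To make the arithmetic behind figures like these concrete, here is a minimal sketch on hypothetical household browsing logs. The column names, sample data, and aggregation are illustrative assumptions, not the Wharton paper’s method, which works with far richer household-level data.

```python
import pandas as pd

# Hypothetical browsing log: one row per article visit per household.
visits = pd.DataFrame({
    "household_id": [1, 1, 2, 2, 3, 3, 3, 4],
    "source_type":  ["fake", "mainstream", "mainstream", "mainstream",
                     "fake", "fake", "mainstream", "fake"],
})

# Count visits per household and source type.
per_hh = (visits.groupby(["household_id", "source_type"])
                .size().unstack(fill_value=0))

# (a) Share of households that consume fake news exclusively.
exclusive_fake = (per_hh["fake"] > 0) & (per_hh["mainstream"] == 0)
print("Exclusive fake-news share:", exclusive_fake.mean())

# (b) Mean mainstream consumption, households exposed to fake news vs. not.
exposed = per_hh["fake"] > 0
print("Mainstream visits (exposed):", per_hh.loc[exposed, "mainstream"].mean())
print("Mainstream visits (not exposed):", per_hh.loc[~exposed, "mainstream"].mean())
```

On real data, comparing the two exposure groups would of course require controls and a causal design; the sketch only shows how the headline shares could be tabulated.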

“We were surprised, although we were very aware going in that there was much that we did not know,” Moon said. “One thing we wanted to see is how much fake news is out there. How do we figure out what’s fake and what’s not, and who is producing the fake news and why? The economic structure of that matters from a business perspective.”…(More)”

Whither Nudge? The Debate Itself Offers Lessons on the Influence of Social Science


Blog by Tony Hockley: “Pursuing impact can be a disturbing balancing act between spin and substance. Underdo the spin whilst maintaining substance and the impact will likely be zero, but credibility is upheld. Overdo the spin and risk the substance being diluted by marketing and misappropriation. The story of Nudge offers insights into what can happen when research has an unpredictably large impact in the world of politics and policy.

Has Nudge overdone the spin, and how much is a one-word book title to blame if it has? It is certainly true that the usual academic balancing act of spin versus substance was tipped by a publisher’s suggestion of a snappy title instead of the usual academic tongue-twister intelligible only to the initiated. Under the title Nudge the book found a receptive audience of policymakers looking to fix problems easily and on the cheap after the 2008 economic crash, and a public policy community eager to adopt exciting new terminology into their own areas of interest. ‘Behavioural Insights Teams’ quickly sprang up around the world, dubbed (very inaccurately) as “nudge units.” There was little discernible push-back against this high-level misappropriation of the term, the general excitement, and the loss of strict definition attached to the authors’ underlying concept for nudge policies of “libertarian paternalism.” In short, the authors had lost control of their own work. The book became a global bestseller. In 2021 it was updated and republished, in what was described as “the final edition.” Perhaps in recognition that the concept had stretched to the end of its logical road?…(More)”.

Using Wikipedia for conflict forecasting


Article by Christian Oswald and Daniel Ohrenhofer: “How can we improve our ability to predict conflicts? Scholars have struggled with this question for a long time. However, as a discipline, and especially over the last two decades, political science has made substantial progress. In general, what we need to improve predictions are advances in data and methodology. Data advances involve both improving the quality of existing data and developing new data sources. We propose a new data source for conflict forecasting efforts: Wikipedia.

The number of country page views indicates international salience of, or interest in, a country. Meanwhile, the number of changes to a country page indicates political controversy between opposing political views.

We took part in the Violence Early-Warning System’s friendly competition to predict changes in battle-related deaths. In our work, we evaluate our findings with out-of-sample predictions using held-out, previously unseen data, and true forecasts into the future. We find support for the predictive power of country page views, whereas we do not for page changes…
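For readers who want to experiment with this idea, the sketch below pulls monthly page views for a country article from the public Wikimedia pageviews REST API. It is a minimal illustration rather than the authors’ pipeline; the endpoint shape follows Wikimedia’s published API, while the function name and parameters are our own.

```python
import requests

API = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def fetch_monthly_views(article, start, end, project="en.wikipedia"):
    """Return {YYYYMM: views} for one article between start and end (YYYYMMDD00)."""
    url = f"{API}/{project}/all-access/user/{article}/monthly/{start}/{end}"
    resp = requests.get(url, headers={"User-Agent": "conflict-forecasting-demo"})
    resp.raise_for_status()
    return {item["timestamp"][:6]: item["views"] for item in resp.json()["items"]}

if __name__ == "__main__":
    # Monthly salience signal for the 'Ukraine' country page during 2021.
    views = fetch_monthly_views("Ukraine", "2021010100", "2021123100")
    for month, count in sorted(views.items()):
        print(month, count)
```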

Globally available data, updated monthly, are ideal for (near) real-time forecasting. However, many commonly used data sources are available only annually. They are updated once a year, often with considerable delay.

Some of these variables, such as democracy or GDP, tend to be relatively static over time. Furthermore, many data sources face the problem of missing values. These occur when it is not possible to find reliable data for a variable for a given country.

More recent data sources such as Twitter, images or text as data, or mobile phone data, often do not provide global coverage. What’s more, collecting and manipulating data from such sources is typically computationally and/or financially costly. Wikipedia provides an alternative data source that, to some extent, overcomes many of these limitations…(More)”.
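As a final illustration of the kind of held-out evaluation described above, the sketch below fits a simple model on synthetic data, using lagged page views as the only feature and holding out the final twelve months as a test set. The data, model choice, and variable names are assumptions for demonstration only, not the authors’ specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_months = 60

# Synthetic monthly page views and next-month change in battle-related deaths.
views = rng.poisson(5000, n_months).astype(float)
delta_deaths = 0.01 * views[:-1] + rng.normal(0, 20, n_months - 1)

# Lag-1 page views predict next month's change; hold out the final 12 months.
X, y = views[:-1].reshape(-1, 1), delta_deaths
split = len(y) - 12
model = LinearRegression().fit(X[:split], y[:split])
print("Held-out MAE:", mean_absolute_error(y[split:], model.predict(X[split:])))
```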