Big data for whom? Data-driven estimates to prioritize the recovery needs of vulnerable populations after a disaster


Blog and paper by Sabine Loos and David Lallemant: “For years, international agencies have been extolling the benefits of big data for sustainable development. Emerging technologies–such as crowdsourcing, satellite imagery, and machine learning–have the power to better inform decision-making, especially decisions that support the 17 Sustainable Development Goals. When a disaster occurs, overwhelming amounts of big data from emerging technology are produced with the intention of supporting disaster responders. We are seeing this now with the recent earthquakes in Turkey and Syria: space agencies are processing satellite imagery to map faults and building damage, while digital humanitarians are crowdsourcing baseline data like roads and buildings.

Eight years ago, the Nepal 2015 earthquake was no exception–emergency managers received maps of shaking and crowdsourced maps of affected people’s needs from diverse sources. A year later, I began research with a team of folks involved in the response to the earthquake, determined to understand how the big data produced after disasters were connected to the earthquake’s long-term effects. Our research team found that much of the data used to guide the recovery focused on building damage, which was often viewed as a proxy for population needs. While building damage information is useful, it does not capture the full array of social, environmental, and physical factors that will lead to disparities in long-term recovery. I assumed that information aimed at supporting vulnerable populations would have been available immediately after the earthquake. However, as I spent time in Nepal during the years after the 2015 earthquake, speaking with government officials and nongovernmental organizations involved in the response and recovery, I found they lacked key information about the needs of the most vulnerable households–those who would face the greatest obstacles during the recovery from the earthquake. While governmental and nongovernmental actors prioritized the needs of vulnerable households as best they could with the information available, I was inspired to pursue research that could provide better information more quickly after an earthquake, to inform recovery efforts.

In our paper published in Communications Earth & Environment [link], we develop a data-driven approach to rapidly estimate which areas are likely to fall behind during recovery due to physical, environmental, and social obstacles. This approach combines survey data on recovery progress with geospatial datasets, readily available after an event, that represent factors expected to impede recovery. To identify communities with disproportionate needs long after a disaster, we propose focusing on those who fall behind in recovery over time, or non-recovery. We focus on non-recovery since it places attention on those who do not recover rather than delineating the characteristics of successful recovery. In addition, several groups in Nepal involved in the recovery told us that they understood vulnerability–a concept that is place-based and can change over time–as describing those who would not be able to recover from the earthquake…(More)”
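To make the approach concrete, here is a minimal, hypothetical sketch (not the authors’ actual pipeline) of how survey-derived recovery outcomes might be combined with rapidly available geospatial covariates to flag areas at risk of non-recovery. All column names, covariates, and data below are illustrative placeholders.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500  # hypothetical number of surveyed communities

# Placeholder covariates standing in for geospatial layers available soon after an event.
df = pd.DataFrame({
    "shaking_intensity": rng.normal(7.0, 1.0, n),        # e.g. a ShakeMap-style intensity
    "landslide_susceptibility": rng.uniform(0, 1, n),
    "travel_time_to_market_hr": rng.gamma(2.0, 1.5, n),
    "poverty_rate": rng.uniform(0.05, 0.6, n),
})

# Synthetic label standing in for a survey-derived outcome, e.g. a household that has
# not rebuilt several years after the earthquake (non-recovery).
logit = (0.6 * (df["shaking_intensity"] - 7)
         + 1.5 * df["landslide_susceptibility"]
         + 0.3 * df["travel_time_to_market_hr"]
         + 2.0 * df["poverty_rate"] - 1.5)
df["non_recovery"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

features = df.drop(columns="non_recovery")
X_train, X_test, y_train, y_test = train_test_split(
    features, df["non_recovery"], random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Predicted risk per community could then be mapped to help prioritize recovery assistance.
df["predicted_non_recovery_risk"] = model.predict_proba(features)[:, 1]
```

In practice the choice of covariates, the survey design, and the model form matter far more than this toy example suggests; the point is simply that the outcome being predicted is non-recovery rather than building damage alone.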

Big Data and Public Policy


Book by Rebecca Moody and Victor Bekkers: “This book provides a comprehensive overview of how the course, content and outcome of policy making are affected by big data. It scrutinises the notion that big and open data make policymaking a more rational process, in which policy makers are able to predict, assess and evaluate societal problems. It also examines how policy makers deal with big data, the problems and limitations they face, and how big data shapes policymaking on the ground. The book considers big data from various perspectives, not just the political, but also the technological, legal, institutional and ethical dimensions. The potential of big data use in the public sector is also assessed, as well as the risks and dangers this might pose. Through several extended case studies, it demonstrates the dynamics of big data and public policy. Offering a holistic approach to the study of big data, this book will appeal to students and scholars of public policy, public administration and data science, as well as those interested in governance and politics…(More)”.

Data Free Disney


Essay by Janet Vertesy: “…Once upon a time, you could just go to Disneyland. You could get tickets at the gates, stand in line for rides, buy food and tchotchkes, even pick up copies of your favorite Disney movies at a local store. It wasn’t even that long ago. The last time I visited, in 2010, the company didn’t record what I ate for dinner or detect that I went on Pirates of the Caribbean five times. It was none of their business.

But sometime in the last few years, tracking and tracing became their business. Like many corporations out there, Walt Disney Studios spent the last decade transforming into a data company.

The theme parks alone are a data scientist’s dream. Just imagine: 50,000 visitors a day, most equipped with cell phones and a specialized app. Millions of location traces, along with ride statistics, lineup times, and food-order preferences. Thousands and thousands of credit card swipes, each populating a database with names and addresses, each one linking purchases across the park grounds. A QR-code scavenger hunt that records the path people took through Star Wars: Galaxy’s Edge. Hotel keycards with entrance times, purchases, snack orders, and more. Millions of photos snapped on rides and by security cameras throughout the park, feeding facial-recognition systems. Tickets with names, birthdates, and portraits attached. At Florida’s Disney World, MagicBands—bracelets using RFID (radio-frequency identification) technology—around visitors’ wrists gather all that information plus fingerprints in one place, while sensors ambiently detect their every move. What couldn’t you do with all that data?…(More)”.

Big Data and the Law of War


Essay by Paul Stephan: “Big data looms large in today’s world. Much of the tech sector regards the building up of large sets of searchable data as part (sometimes the greater part) of its business model. Surveillance-oriented states, of which China is the foremost example, use big data to guide and bolster monitoring of their own people as well as potential foreign threats. Many other states are not far behind in the surveillance arms race, notwithstanding the attempts of the European Union to put its metaphorical finger in the dike. Finally, ChatGPT has revived popular interest in artificial intelligence (AI), a cultural, economic, and social phenomenon that uses big data to optimize the training and algorithm design on which it depends.

If big data is growing in significance, might it join territory, people, and property as objects of international conflict, including armed conflict? So far it has not been front and center in Russia’s invasion of Ukraine, the war that currently consumes much of our attention. But future conflicts could certainly feature attacks on big data. China and Taiwan, for example, both have sophisticated technological infrastructures that encompass big data and AI capabilities. The risk that they might find themselves at war in the near future is larger than anyone would like. What, then, might the law of war have to say about big data? More generally, if existing law does not meet our needs, how might new international law address the issue?

In a recent essay, part of an edited volume on “The Future Law of Armed Conflict,” I argue that big data is a resource and therefore a potential target in an armed conflict. I address two issues: Under the law governing the legality of war (jus ad bellum), what kinds of attacks on big data might justify an armed response, touching off a bilateral (or multilateral) armed conflict (a war)? And within an existing armed conflict, what are the rules (jus in bello, also known as international humanitarian law, or IHL) governing such attacks?

The distinction is meaningful. If cyber operations rise to the level of an armed attack, then the targeted state has, according to Article 51 of the U.N. Charter, an “inherent right” to respond with armed force. Moreover, the target need not confine its response to a symmetrical cyber operation. Once attacked, a state may use all forms of armed force in response, albeit subject to the restrictions imposed by IHL. If the state regards, say, a takedown of its financial system as an armed attack, it may respond with missiles…(More)”.

Science and the World Cup: how big data is transforming football


Essay by David Adam: “The scowl on Cristiano Ronaldo’s face made international headlines last month when the Portuguese superstar was pulled from a match between Manchester United and Newcastle with 18 minutes left to play. But he’s not alone in his sentiment. Few footballers agree with a manager’s decision to substitute them in favour of a fresh replacement.

During the upcoming football World Cup tournament in Qatar, players will have a more evidence-based way to argue for time on the pitch. Within minutes of the final whistle, tournament organizers will send each player a detailed breakdown of their performance. Strikers will be able to show how often they made a run and were ignored. Defenders will have data on how much they hassled and harried the opposing team when it had possession.

It’s the latest incursion of numbers into the beautiful game. Data analysis now helps to steer everything from player transfers and the intensity of training, to targeting opponents and recommending the best direction to kick the ball at any point on the pitch.

Meanwhile, footballers face the kind of data scrutiny more often associated with an astronaut. Wearable vests and straps can now sense motion, track position with GPS and count the number of shots taken with each foot. Cameras at multiple angles capture everything from headers won to how long players keep the ball. And to make sense of this information, most elite football teams now employ data analysts, including mathematicians, data scientists and physicists plucked from top companies and labs such as computing giant Microsoft and CERN, Europe’s particle-physics laboratory near Geneva, Switzerland….(More)”.
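As a toy illustration of the kind of metric such trackers produce, the sketch below computes total distance covered and a sprint count from a stream of (time, x, y) position samples. The 10 Hz sampling rate, the 7 m/s sprint threshold, and all positions are made up for illustration; no club’s actual system is implied.

```python
import numpy as np

def distance_and_sprints(t, x, y, sprint_speed=7.0):
    """t in seconds, x and y in metres; sprint_speed threshold in m/s (~25 km/h)."""
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    step = np.hypot(dx, dy)              # distance covered in each sampling interval
    speed = step / dt                    # average speed in each interval
    total_distance = step.sum()
    above = speed >= sprint_speed        # intervals above the sprint threshold
    # Count separate bursts: transitions from below-threshold to above-threshold.
    sprints = int(above[0]) + np.count_nonzero(above[1:] & ~above[:-1])
    return total_distance, sprints

# Hypothetical 10 Hz samples for a short passage of play: a steady run with one burst.
t = np.arange(0, 10, 0.1)
x = np.cumsum(np.full_like(t, 0.4))      # roughly 4 m/s along the pitch
y = np.zeros_like(t)
x[60:] += np.cumsum(np.full(40, 0.45))   # acceleration to roughly 8.5 m/s
print(distance_and_sprints(t, x, y))     # -> (about 57.6 m, 1 sprint)
```

Real systems layer far richer event data (shots, headers, pressing actions) on top of these position traces, as the article describes.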

Ethical Considerations in Re-Using Private Sector Data for Migration-Related Policy


IOM practitioner’s paper: “This paper assesses the ethical risks of using non-traditional data sources to inform migration-related policymaking and suggests practical safeguards for various stages of the data cycle. The past decade has witnessed the rapid growth of non-traditional data (social media, mobile phones, satellite data, bank records, etc.) and their use in migration research and policy. While these data sources may be tempting and can shed light on major migration trends, ensuring the ethical and responsible use of big data at every stage of migration research and policymaking is complex.

The recognition of the potential of new data sources for migration policy has grown exponentially in recent years. Data innovation is one of the crosscutting priorities of IOM’s Migration Data Strategy. Further, the UN General Assembly recognises rapid technological developments and their potential in achieving the Sustainable Development Goals, and the Global Compact for Safe, Orderly and Regular Migration highlights the importance of harnessing data innovation to improve data and evidence for informed policies on migration. However, with big data come big risks. New technological developments have opened new challenges, particularly concerning data protection, individual privacy, human security, and fundamental rights. These risks can be greater for certain migrant and displaced groups. The identified risks are:…(More)” (see also Big Data for Migration Alliance)

Big Data and Official Statistics


Paper by Katharine G. Abraham: “The infrastructure and methods for developed countries’ economic statistics, largely established in the mid-20th century, rest almost entirely on survey and administrative data. The increasing difficulty of obtaining survey responses threatens the sustainability of this model. Meanwhile, users of economic data are demanding ever more timely and granular information. “Big data” originally created for other purposes offer the promise of new approaches to the compilation of economic data. Drawing primarily on the U.S. experience, the paper considers the challenges to incorporating big data into the ongoing production of official economic statistics and provides examples of progress towards that goal to date. Beyond their value for the routine production of a standard set of official statistics, new sources of data create opportunities to respond more nimbly to emerging needs for information. The concluding section of the paper argues that national statistical offices should expand their mission to seize these opportunities…(More)”.

Decisions Over Decimals: Striking the Balance between Intuition and Information


Book by Christopher J. Frank, Paul F. Magnone, Oded Netzer: “Agile decision making is imperative as you lead in a data-driven world. Amid streams of data and countless meetings, we make hasty decisions, slow decisions, and often no decisions. Uniquely bridging theory and practice, Decisions Over Decimals breaks this pattern by uniting data intelligence with human judgment to get to action – a sharp approach the authors refer to as Quantitative Intuition (QI). QI raises the power of thinking beyond big data without neglecting it, and beyond chasing the perfect decision while appreciating that such a thing can never really exist….(More)”.

Smart Streetlights are Casting a Long Shadow Over Our Cities


Article by Zhile Xie: “This is not a surveillance system—nobody is watching it 24 hours a day,” said Erik Caldwell, director of economic development in San Diego, in an interview where he was asked if the wide deployment of “smart” streetlights had turned San Diego into a surveillance city. Innocuous at first glance, this statement demonstrates the pernicious impact of artificial intelligence on new “smart” streetlight systems. As Caldwell suggests, a central human gaze is what makes the streetlight function as a surveillance instrument. However, the lack of human supervision only signals its enhanced capacity. Smart sensors are able to process and communicate environmental information that does not present itself in a visual format and does not rely on human interpretation. On the one hand, they reinforce streetlights’ function as a surveillance instrument, historically associated with light and visibility. On the other hand, in tandem with a wide range of sensors embedded in our everyday environment, they also enable for-profit data extraction on a vast scale, under the auspices of a partnership between local governments and tech corporations.

The streetlight was originally designed as a surveillance device and has been refined to that end ever since. Its association with surveillance and security can be found as early as 400 BC. Citizens of Ancient Rome started to install an oil lamp in front of every villa to prevent tripping or thefts, and an enslaved person would be designated to watch the lamp—lighting was already paired with the notion of control through slavery. As Wolfgang Schivelbusch has detailed in his book Disenchanted Night, street lighting also emerged in medieval European cities alongside practices of policing. Only designated watchmen who carried a torch and a weapon were allowed to be out on the street. This ancient connection between security and visibility has been the basis of the wide deployment of streetlights in modern cities. Moreover, as Edwin Heathcote has explained in a recent article for the Architectural Review, gas streetlights were first introduced to Paris during Baron Haussmann’s restructuring of the city between 1853 and 1870, which was designed in part to prevent revolutionary uprisings. The invention of electric light bulbs in the late nineteenth century in Europe triggered new fears and imaginations around the use of streetlights for social control. For instance, in his 1894 dystopian novel The Land of the Changing Sun, W.N. Harben envisions an electric-optical device that makes possible 24-hour surveillance over the entire population of an isolated country, Alpha. The telescopic system is aided by an artificial “sun” that lights up the atmosphere all year round, along with networked observatories across the land that capture images of their surroundings, which are transmitted to a “throne room” for inspection by the king and police…(More)”.

Landsat turns 50: How satellites revolutionized the way we see – and protect – the natural world


Article by Stacy Morford: “Fifty years ago, U.S. scientists launched a satellite that dramatically changed how we see the world.

It captured images of Earth’s surface in minute detail, showing how wildfires burned landscapes, how farms erased forests, and many other ways humans were changing the face of the planet.

The first satellite in the Landsat series launched on July 23, 1972. Eight others followed, providing the same views so changes could be tracked over time, but with increasingly powerful instruments. Landsat 8 and Landsat 9 are orbiting the planet today, and NASA and the U.S. Geological Survey are planning a new Landsat mission.

The images and data from these satellites are used to track deforestation and changing landscapes around the world, locate urban heat islands, and understand the impact of new river dams, among many other projects. Often, the results help communities respond to risks that may not be obvious from the ground.

Here are three examples of Landsat in action, from The Conversation’s archive.

Tracking changes in the Amazon

When work began on the Belo Monte Dam project in the Brazilian Amazon in 2015, Indigenous tribes living along the Big Bend of the Xingu River started noticing changes in the river’s flow. The water they relied on for food and transportation was disappearing.

Upstream, a new channel would eventually divert as much as 80% of the water to the hydroelectric dam, bypassing the bend.

The consortium that runs the dam argued that there was no scientific proof that the change in water flow harmed fish.

But there is clear proof of the Belo Monte Dam project’s impact – from above, write Pritam Das, Faisal Hossain, Hörður Helgason and Shahzaib Khan at the University of Washington. Using satellite data from the Landsat program, the team showed how the dam dramatically altered the hydrology of the river…

It’s hot in the city – and even hotter in some neighborhoods

Landsat’s instruments can also measure surface temperatures, allowing scientists to map heat risk street by street within cities as global temperatures rise.

“Cities are generally hotter than surrounding rural areas, but even within cities, some residential neighborhoods get dangerously warmer than others just a few miles away,” writes Daniel P. Johnson, who uses satellites to study the urban heat island effect at Indiana University.

Neighborhoods with more pavement and buildings and fewer trees can be 10 degrees Fahrenheit (5.5 C) or more warmer than leafier neighborhoods, Johnson writes. He found that the hottest neighborhoods tend to be low-income, have majority Black or Hispanic residents and have been subjected to redlining, the discriminatory practice once used to deny loans in racial and ethnic minority communities…(More)”.
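For a sense of how such street-by-street heat maps are built, here is a minimal sketch of the underlying arithmetic: a Landsat-style surface-temperature band is rescaled to physical units and averaged over neighborhood footprints. The array, the neighborhood split, and the scale and offset values below are placeholders; a real analysis would take the rescaling constants from the scene metadata and the neighborhood boundaries from GIS layers.

```python
import numpy as np

# Assumed rescaling of digital numbers to Kelvin; in practice these constants
# come from the scene metadata, not from this sketch.
SCALE, OFFSET = 0.00341802, 149.0

# Synthetic 100x100 "scene": left half stands in for a leafier neighborhood,
# right half for a more heavily paved one.
dn = np.full((100, 100), 45000.0)
dn[:, 50:] = 46700.0

kelvin = dn * SCALE + OFFSET
fahrenheit = (kelvin - 273.15) * 9 / 5 + 32

leafy_mask = np.zeros_like(dn, dtype=bool)
leafy_mask[:, :50] = True                 # stand-in for a leafy neighborhood polygon
paved_mask = ~leafy_mask                  # stand-in for a paved neighborhood polygon

leafy_mean = fahrenheit[leafy_mask].mean()
paved_mean = fahrenheit[paved_mask].mean()
print(f"leafy: {leafy_mean:.1f} F, paved: {paved_mean:.1f} F, "
      f"difference: {paved_mean - leafy_mean:.1f} F")   # roughly a 10 F gap
```

The same per-neighborhood averaging, applied to real surface-temperature scenes and census or redlining boundaries, is what allows comparisons like the ones Johnson describes.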