Repeat photos show change in southern African landscapes: a citizen science project


Paper by Timm Hoffman and Hana Petersen: “Every place in the world has a history. To understand it in the present you need some knowledge of its past. The history of the earth can be read from its rocks; the history of life, from the evolutionary histories and relationships of its species. But what of the history of modern landscapes and the many benefits we derive from them, such as water and food? What are their histories – and how are they shifting in response to the intense pressures they face from climate change and from people?

Historical landscape photographs provide one way of measuring this. They capture the way things were at a moment in time. By standing at the same place and re-photographing the same scene, it is possible to document the nature of change. Sometimes researchers can even measure the extent and rate of change for different elements in the landscape.
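As a rough illustration of how such measurements might be made, here is a minimal sketch that estimates fractional vegetation cover in two co-registered repeat photographs using a simple greenness index and reports the change between dates. The filenames, threshold, and index are illustrative assumptions, not the project's actual method.

```python
# Hedged sketch: quantify change between two aligned repeat photographs
# by comparing fractional vegetation cover. Filenames and the greenness
# threshold are hypothetical; real analyses use careful co-registration
# and calibrated classification.
import numpy as np
from PIL import Image

def vegetation_fraction(path: str, threshold: float = 0.05) -> float:
    """Fraction of pixels whose excess-green index exceeds the threshold."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    excess_green = 2 * g - r - b  # crude greenness index
    return float((excess_green > threshold).mean())

cover_then = vegetation_fraction("site42_1920.jpg")  # historical image
cover_now = vegetation_fraction("site42_2020.jpg")   # repeat image
years = 2020 - 1920
change = cover_now - cover_then
print(f"Cover change: {change:+.1%} ({change / years:+.3%} per year)")
```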

Reasons for the change can also sometimes be observed from this and other historical information, such as the climate or fire record. All of these data can then be related to what has been written about environmental change using other approaches and models. Researchers can ascertain whether the environment has reached a critical threshold and consider how to respond to the changes.

This is what repeat photography is all about…

The rePhotoSA project was launched in August 2015. The idea is to involve interested members of the public in re-photographing historical locations. This has two benefits. First, participants add to the number of repeated images. Second, public awareness of landscape change is raised.

The project website has over 6,000 historical images from ten primary photographic collections of southern African landscapes, dating from the late 1800s to the early 2000s. The geographic spread of the photographs is influenced largely by the interests of the original photographers. Often these photographs are donated to the project by family members, or institutions to which the original photographers belonged – and sometimes by the photographers themselves…(More)”.

The Staggering Ecological Impacts of Computation and the Cloud


Essay by Steven Gonzalez Monserrate: “While in technical parlance the “Cloud” might refer to the pooling of computing resources over a network, in popular culture, “Cloud” has come to signify and encompass the full gamut of infrastructures that make online activity possible, everything from Instagram to Hulu to Google Drive. Like a puffy cumulus drifting across a clear blue sky, refusing to maintain a solid shape or form, the Cloud of the digital is elusive, its inner workings largely mysterious to the wider public, an example of what MIT cybernetician Norbert Wiener once called a “black box.” But just as the clouds above us, however formless or ethereal they may appear to be, are in fact made of matter, the Cloud of the digital is also relentlessly material.

To get at the matter of the Cloud we must unravel the coils of coaxial cables, fiber optic tubes, cellular towers, air conditioners, power distribution units, transformers, water pipes, computer servers, and more. We must attend to its material flows of electricity, water, air, heat, metals, minerals, and rare earth elements that undergird our digital lives. In this way, the Cloud is not only material, but is also an ecological force. As it continues to expand, its environmental impact increases, even as the engineers, technicians, and executives behind its infrastructures strive to balance profitability with sustainability. Nowhere is this dilemma more visible than in the walls of the infrastructures where the content of the Cloud lives: the factory libraries where data is stored and computational power is pooled to keep our cloud applications afloat….

To quell this thermodynamic threat, data centers overwhelmingly rely on air conditioning, a mechanical process that refrigerates the gaseous medium of air, so that it can displace or lift perilous heat away from computers. Today, power-hungry computer room air conditioners (CRACs) or computer room air handlers (CRAHs) are staples of even the most advanced data centers. In North America, most data centers draw power from “dirty” electricity grids, especially in Virginia’s “data center alley,” the site of 70 percent of the world’s internet traffic in 2019. To cool, the Cloud burns carbon, what Jeffrey Moro calls an “elemental irony.” In most data centers today, cooling accounts for greater than 40 percent of electricity usage….(More)”.
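To make the scale of that figure concrete, here is a hedged back-of-envelope sketch. Only the 40 percent cooling share comes from the essay; the facility size and grid carbon intensity are invented for illustration.

```python
# Back-of-envelope sketch of "to cool, the Cloud burns carbon".
# Assumptions (not from the essay): a 20 MW IT load and a grid
# emitting 0.35 kg CO2 per kWh; the 40% cooling share is the
# essay's figure.
it_load_mw = 20.0        # assumed IT equipment load
cooling_share = 0.40     # cooling's share of total electricity (essay)
grid_kg_per_kwh = 0.35   # assumed carbon intensity of a "dirty" grid

# If cooling is 40% of the total draw, the total is IT / (1 - 0.40)
total_mw = it_load_mw / (1 - cooling_share)
cooling_mwh_per_year = total_mw * cooling_share * 24 * 365
cooling_t_co2 = cooling_mwh_per_year * 1_000 * grid_kg_per_kwh / 1_000

print(f"Cooling energy: {cooling_mwh_per_year:,.0f} MWh/year")
print(f"Cooling carbon: {cooling_t_co2:,.0f} tonnes CO2/year")  # ~41,000
```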

The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future


Book by Kelly Bronson: “Every new tractor now contains built-in sensors that collect data and stream it to cloud-based infrastructure. Seed and chemical companies are using these data, and these agribusinesses are a form of big tech alongside firms like Google and Facebook.

The Immaculate Conception of Data peeks behind the secretive legal agreements surrounding agricultural big data to trace how it is used and with what consequences. Agribusinesses are among the oldest oligopoly corporations in the world, and their concentration gives them an advantage over other food system actors. Kelly Bronson explores what happens when big data get caught up in pre-existing arrangements of power. Her richly ethnographic account details the work of corporate scientists, farmers using the data, and activist “hackers” building open-source data platforms. Actors working in private and public contexts have divergent views on whom new technology is for, how it should be developed, and what kinds of agriculture it should support. Surprisingly, despite their differences, these groups share a way of speaking about data and its value for the future. Bronson calls this the immaculate conception of data, arguing that this phenomenon is a dangerous framework for imagining big data and what it might do for society.

Drawing our attention to agriculture as an important new site for big tech criticism, The Immaculate Conception of Data uniquely bridges science and technology studies, critical data studies, and food studies, bringing to light salient issues related to data justice and a sustainable food system…(More)”.

How climate data scarcity costs lives


Paula Dupraz-Dobias at The New Humanitarian: “Localised data can help governments project climate forecasts, prepare for disasters as early as possible, and create long-term policies for adapting to climate change.

Wealthier countries tend to have better access to new technology that allows for more accurate predictions, such as networks of temperature, wind, and atmospheric pressure sensors.

But roughly half the world’s countries do not have multi-hazard early warning systems, according to the UN’s World Meteorological Organization. Some 60 percent lack basic water information services designed to gather and analyse data on surface, ground, and atmospheric water, which could help reduce flooding and better manage water. Some 43 percent do not communicate or interact adequately with other countries to share potentially life-saving information.

The black holes in weather data around the globe

Map: Availability of surface land observations (WMO/ECMWF). The US reports weather observations every three hours, as opposed to the hourly reporting required by World Meteorological Organization regulations. It says it will comply with these from 2023.

See WIGOS’s full interactive map

“Right now, we can analyse weather; in other words, what happens today, tomorrow, and the day after,” said Ena Jaimes Espinoza, a weather expert at CENEPRED, Peru’s national centre for disaster monitoring, prevention, and risk reduction. “For climate data, where you need years of data, there is still a dearth [of information].”

Without this information, she said, it’s difficult to establish accurate trends in different areas of the country – trends that could help forecasters better predict conditions in Tarucani, for example, or help policymakers to plan responses.

Inadequate funding, poor data-sharing between countries, and conflict, at least in some parts of the world, contribute to the data shortfalls. Climate experts warn that some of the world’s most disaster-vulnerable countries risk being left behind as this information gap widens…(More)”.

Automating the War on Noise Pollution


Article by Linda Poon: “Any city dweller is no stranger to the frequent revving of motorbikes and car engines, made all the more intolerable after the months of silence during pandemic lockdowns. Some cities have decided to take action. 

Paris police set up an anti-noise patrol in 2020 to ticket motorists whose vehicles exceed a certain decibel level, and soon, the city will start piloting the use of noise sensors in two neighborhoods. Called Medusa, each device uses four microphones to detect and measure noise levels, and two cameras to help authorities track down the culprit. No decibel threshold or fines will be set during the three-month trial period, according to French newspaper Liberation, but it’ll test the potentials and limits of automating the war on sound pollution.
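For a sense of what such a sensor does computationally, here is a minimal sketch: convert a window of microphone samples to a sound-pressure level in decibels and flag readings above a limit. The 85 dB limit, the calibration constant, and the unweighted RMS measure are assumptions for illustration, not Medusa's actual design.

```python
# Hedged sketch of a roadside noise check: RMS level in dB, flagged
# against an assumed enforcement threshold. Real systems apply
# frequency weighting (e.g. dBA) and calibrated microphones.
import numpy as np

def spl_db(samples: np.ndarray, calibration_db: float = 94.0) -> float:
    """Unweighted RMS level in dB, offset by an assumed calibration."""
    rms = np.sqrt(np.mean(samples ** 2))
    return calibration_db + 20 * float(np.log10(max(rms, 1e-12)))

THRESHOLD_DB = 85.0  # hypothetical enforcement threshold

window = np.random.default_rng(0).normal(0, 0.3, 48_000)  # 1 s of fake audio
level = spl_db(window)
if level > THRESHOLD_DB:
    print(f"{level:.1f} dB exceeds limit: save clip and camera frames")
else:
    print(f"{level:.1f} dB within limit")
```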

Cities like Toronto and Philadelphia are also considering deploying similar tools. By now, research has been mounting about the health effects of continuous noise exposure, including links to high blood pressure and heart disease, and to poor mental health. And for years, many cities have been tackling noise through ordinances and urban design, including various bans on leaf blowers, on construction at certain hours and on cars. Some have even hired “night mayors” to, among other things, address complaints about after-hours noise.

But enforcement, even with the help of simple camera-and-noise radars, has been a challenge. Since 2018, the Canadian city of Edmonton has been piloting the use of four radars attached to light poles at busy intersections in the downtown area. A 2021 report on the second phase of the project, completed in 2020, found that officials had to manually sift through the data to take out noise made by, say, sirens. And the recordings didn’t always provide strong enough evidence against the offender in court. It was also costly: The pilot cost taxpayers $192,000, while fines generated a little more than half that amount, according to CTV News Edmonton.

Those obstacles have made noise pollution an increasingly popular target for smart city innovation, with companies and researchers looking to make environmental monitoring systems do more than just measure decibel levels…(More)”.

A tale of two labs: Rethinking urban living labs for advancing citizen engagement in food system transformations


Paper by Anke Brons et al: “Citizen engagement is heralded as essential for food democracy and equality, yet the implementation of inclusive citizen engagement mechanisms in urban food systems governance has lagged behind. This paper aims to further the agenda of citizen engagement in the transformation towards healthy and sustainable urban food systems by offering a conceptual reflection on urban living labs (ULLs) as a methodological platform. Over the past decades, ULLs have become increasingly popular to actively engage citizens in methodological testbeds for innovations within real-world settings. The paper proposes that ULLs as a tool for inclusive citizen engagement can be utilized in two ways: (i) the ULL as the daily life of which citizens are the experts, aimed at uncovering the unreflexive agency of a highly diverse population in co-shaping the food system and (ii) the ULL as a break with daily life aimed at facilitating reflexive agency in (re)shaping food futures. We argue that both ULL approaches have the potential to facilitate inclusive citizen engagement in different ways by strengthening the breadth and the depth of citizen engagement respectively. The paper concludes by proposing a sequential implementation of the two types of ULL, paying attention to spatial configurations and the short-termed nature of ULLs….(More)”.

Cities and the Climate-Data Gap


Article by Robert Muggah and Carlo Ratti: “With cities facing disastrous climate stresses and shocks in the coming years, one would think they would be rushing to implement mitigation and adaptation strategies. Yet most urban residents are only dimly aware of the risks, because their cities’ mayors, managers, and councils are not collecting or analyzing the right kinds of information.

With more governments adopting strategies to reduce greenhouse-gas (GHG) emissions, cities everywhere need to get better at collecting and interpreting climate data. More than 11,000 cities have already signed up to a global covenant to tackle climate change and manage the transition to clean energy, and many aim to achieve net-zero emissions before their national counterparts do. Yet virtually all of them still lack the basic tools for measuring progress.

Closing this gap has become urgent, because climate change is already disrupting cities around the world. Cities on almost every continent are being ravaged by heat waves, fires, typhoons, and hurricanes. Coastal cities are being battered by severe flooding connected to sea-level rise. And some megacities and their sprawling peripheries are being reconsidered altogether, as in the case of Indonesia’s $34 billion plan to move its capital from Jakarta to Borneo by 2024.

Worse, while many subnational governments are setting ambitious new green targets, over 40% of cities (home to some 400 million people) still have no meaningful climate-preparedness strategy. And the share of cities with such strategies is even lower in Africa and Asia – where an estimated 90% of urbanization over the next three decades is expected to occur.

We know that climate-preparedness plans are closely correlated with investment in climate action, including nature-based solutions and systemic resilience. But strategies alone are not enough. We also need to scale up data-driven monitoring platforms. Powered by satellites and sensors, these systems can track temperatures inside and outside buildings, alert city dwellers to air-quality issues, and provide high-resolution information on concentrations of specific greenhouse gases, such as carbon dioxide, as well as pollutants like nitrogen dioxide and particulate matter…(More)”.
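To illustrate the kind of processing such a platform might run, here is a hedged sketch that aggregates pollutant readings by district and flags exceedances. The sensor schema, district names, and the 25 µg/m³ threshold are all invented for illustration.

```python
# Hypothetical alerting step for a city air-quality monitoring platform:
# average PM2.5 readings per district and flag those above a guideline.
from collections import defaultdict

readings = [  # (district, PM2.5 in µg/m³) from imagined sensors
    ("riverside", 12.0), ("riverside", 18.5),
    ("old-town", 31.2), ("old-town", 28.9),
]
GUIDELINE = 25.0  # assumed threshold, not an official standard

by_district = defaultdict(list)
for district, value in readings:
    by_district[district].append(value)

for district, values in sorted(by_district.items()):
    mean = sum(values) / len(values)
    status = "ALERT" if mean > GUIDELINE else "ok"
    print(f"{district}: {mean:.1f} µg/m³ [{status}]")
```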

The AI Carbon Footprint and Responsibilities of AI Scientists


Paper by Guglielmo Tamburrini: “This article examines ethical implications of the growing AI carbon footprint, focusing on the fair distribution of prospective responsibilities among groups of involved actors. First, major groups of involved actors are identified, including AI scientists, AI industry, and AI infrastructure providers, from datacenters to electrical energy suppliers. Second, responsibilities of AI scientists concerning climate warming mitigation actions are disentangled from responsibilities of other involved actors. Third, to implement these responsibilities, nudging interventions are suggested, leveraging competitive AI games that would reward research combining better system accuracy with greater computational and energy efficiency. Finally, in addition to the AI carbon footprint, it is argued that another ethical issue with a genuinely global dimension is now emerging in the AI ethics agenda. This issue concerns the threats that AI-powered cyberweapons pose to the digital command, control, and communication infrastructure of nuclear weapons systems…(More)”.
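The paper does not prescribe a scoring rule, but the nudging idea is easy to make concrete: rank competition entries by a score that rewards accuracy and penalizes measured energy use. The formula and entries below are invented for illustration.

```python
# Hypothetical "green leaderboard" score: accuracy minus a log-scaled
# energy penalty. The weighting (alpha) and entries are assumptions.
import math
from dataclasses import dataclass

@dataclass
class Entry:
    name: str
    accuracy: float    # fraction correct on the benchmark
    energy_kwh: float  # measured training + inference energy

def green_score(e: Entry, alpha: float = 0.1) -> float:
    return e.accuracy - alpha * math.log10(1 + e.energy_kwh)

entries = [
    Entry("big-model", accuracy=0.92, energy_kwh=50_000),
    Entry("lean-model", accuracy=0.89, energy_kwh=800),
]
for e in sorted(entries, key=green_score, reverse=True):
    print(f"{e.name}: score = {green_score(e):.3f}")
# The leaner model wins despite lower raw accuracy.
```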

Climate Change and AI: Recommendations for Government


Press Release: “A new report, developed by the Centre for AI & Climate and Climate Change AI for the Global Partnership on AI (GPAI), calls for governments to recognise the potential for artificial intelligence (AI) to accelerate the transition to net zero, and to put in place the support needed to advance AI-for-climate solutions. The report is being presented at COP26 today.

The report, Climate Change and AI: Recommendations for Government, highlights 48 specific recommendations for how governments can both support the application of AI to climate challenges and address the climate-related risks that AI poses.

The report was commissioned by the Global Partnership on AI (GPAI), a partnership between 18 countries and the EU that brings together experts from across countries and sectors to help shape the development of AI.

AI is already being used to support climate action in a wide range of use cases, several of which the report highlights. These include:

  • National Grid ESO, which has used AI to double the accuracy of its forecasts of UK electricity demand. Radically improving forecasts of electricity demand and renewable energy generation will be critical in enabling greater proportions of renewable energy on electricity grids (a toy forecasting sketch follows this list).
  • The UN Satellite Centre (UNOSAT), which has developed the FloodAI system that delivers high-frequency flood reports. FloodAI’s reports, which use a combination of satellite data and machine learning, have improved the response to climate-related disasters in Asia and Africa.
  • Climate TRACE, a global coalition of organizations, which has radically improved the transparency and accuracy of emissions monitoring by leveraging AI algorithms and data from more than 300 satellites and 11,000 sensors.
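As a toy illustration of the forecasting use case in the first bullet, the sketch below fits a gradient-boosted regressor to synthetic hourly demand driven by time-of-day, weekday, and temperature features. Everything here is invented; National Grid ESO's actual system is not described in the report.

```python
# Hedged sketch of ML-based electricity demand forecasting on
# synthetic data. Features, model choice, and data are illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
n = 24 * 365                       # one year of hourly observations
hour = np.arange(n) % 24
weekday = (np.arange(n) // 24) % 7
temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / n) + rng.normal(0, 2, n)

# Synthetic demand: daily cycle + weekend dip + heating load + noise
demand = (30 + 10 * np.sin(2 * np.pi * (hour - 6) / 24)
          - 3 * (weekday >= 5) + 0.8 * np.maximum(0, 15 - temp)
          + rng.normal(0, 1.5, n))

X = np.column_stack([hour, weekday, temp])
model = GradientBoostingRegressor().fit(X[:-168], demand[:-168])
pred = model.predict(X[-168:])     # hold out the final week
mae = np.abs(pred - demand[-168:]).mean()
print(f"Mean absolute error on held-out week: {mae:.2f} (arbitrary units)")
```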

The authors also detail critical bottlenecks that are impeding faster adoption. To address these, the report calls for governments to:

  • Improve data ecosystems in sectors critical to the climate transition, including the development of digital twins in, for example, the energy sector.
  • Increase support for research, innovation, and deployment through targeted funding, infrastructure, and improved market designs.
  • Make climate change a central consideration in AI strategies to shape the responsible development of AI as a whole.
  • Support greater international collaboration and capacity building to facilitate the development and governance of AI-for-climate solutions….(More)”.

Countries’ climate pledges built on flawed data


Article by Chris Mooney, Juliet Eilperin, Desmond Butler, John Muyskens, Anu Narayanswamy, and Naema Ahmed: “Across the world, many countries underreport their greenhouse gas emissions in their reports to the United Nations, a Washington Post investigation has found. An examination of 196 country reports reveals a giant gap between what nations declare their emissions to be versus the greenhouse gases they are sending into the atmosphere. The gap ranges from at least 8.5 billion to as high as 13.3 billion tons a year of underreported emissions — big enough to move the needle on how much the Earth will warm.

The plan to save the world from the worst of climate change is built on data. But the data the world is relying on is inaccurate.

“If we don’t know the state of emissions today, we don’t know whether we’re cutting emissions meaningfully and substantially,” said Rob Jackson, a professor at Stanford University and chair of the Global Carbon Project, a collaboration of hundreds of researchers. “The atmosphere ultimately is the truth. The atmosphere is what we care about. The concentration of methane and other greenhouse gases in the atmosphere is what’s affecting climate.”

At the low end, the gap is larger than the yearly emissions of the United States. At the high end, it approaches the emissions of China and comprises 23 percent of humanity’s total contribution to the planet’s warming, The Post found…
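The article's own numbers can be cross-checked with a few lines of arithmetic; the implied global total below is derived from its 23 percent claim rather than stated directly.

```python
# Quick consistency check on the reported figures.
gap_low_gt, gap_high_gt = 8.5, 13.3  # underreported emissions, Gt/year (article)
share_of_total = 0.23                # high-end gap as share of humanity's total

implied_total_gt = gap_high_gt / share_of_total
print(f"Implied global emissions: ~{implied_total_gt:.0f} Gt/year")
# -> ~58 Gt/year, in line with published global GHG totals of
#    roughly 55-60 Gt CO2-equivalent per year.
```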

A new generation of sophisticated satellites that can measure greenhouse gases is now orbiting Earth, capable of detecting massive methane leaks. Data from the International Energy Agency (IEA) lists Russia as the world’s top oil and gas methane emitter, but that’s not what Russia reports to the United Nations. Its official numbers fall millions of tons shy of what independent scientific analyses show, a Post investigation found. Many oil and gas producers in the Persian Gulf region, such as the United Arab Emirates and Qatar, also report very small levels of oil and gas methane emissions that don’t line up with other scientific data sets.

“It’s hard to imagine how policymakers are going to pursue ambitious climate actions if they’re not getting the right data from national governments on how big the problem is,” said Glenn Hurowitz, chief executive of Mighty Earth, an environmental advocacy group….(More)”.