How science could aid the US quest for environmental justice


Jeff Tollefson at Nature: “…The network of US monitoring stations that detect air pollution catches only broad trends across cities and regions, and isn’t equipped for assessing air quality at the level of streets and neighbourhoods. So environmental scientists are exploring ways to fill the gaps.

In one project funded by NASA, researchers are developing methods to assess street-level pollution using measurements of aerosols and other contaminants from space. When the team trained its tools on Washington DC, the scientists found (ref. 1) that sections in the city’s southeast, which have a larger share of Black residents, are exposed to much higher levels of fine-soot pollution than wealthier — and whiter — areas in the northwest of the city, primarily because of the presence of major roads and bus depots in the southeast.

[Chart: Cumulative burden. Air-pollution levels tend to be higher in poorer and predominantly Black neighbourhoods of Washington DC. Source: ref. 1]

The detailed pollution data painted a more accurate picture of the burden on a community that also lacks access to high-quality medical facilities and has high rates of cardiovascular disorders and other diseases. The results help to explain a more than 15-year difference in life expectancy between predominantly white neighbourhoods and some predominantly Black ones.

The analysis underscores the need to consider pollution and socio-economic data in parallel, says Susan Anenberg, director of the Climate and Health Institute at the George Washington University in Washington DC and co-leader of the project. “We can actually get neighbourhood-scale observations from space, which is quite incredible,” she says, “but if you don’t have the demographic, economic and health data as well, you’re missing a very important piece of the puzzle.”

Other projects, including one from technology company Aclima, in San Francisco, California, are focusing on ubiquitous, low-cost sensors that measure air pollution at the street level. Over the past few years, Aclima has deployed a fleet of vehicles to collect street-level data on air pollutants such as soot and greenhouse gases across 101 municipalities in the San Francisco Bay area. Its data show that air-pollution levels can vary by as much as 800% from one neighbourhood block to the next.

Working directly with disadvantaged communities and environmental regulators in California, as well as with other states and localities, the company provides pollution monitoring on a subscription basis. It also offers the use of its screening tool, which integrates a suite of socio-economic data and can be used to assess cumulative impacts…(More)”.

More than just information: what does the public want to know about climate change?


Paper by Michael Murunga et al.: “Public engagement on climate change is a vital concern for both science and society. Despite more people engaging with climate change science today, there remains high-level contestation in the public sphere regarding scientific credibility, as well as difficulty in identifying the information needs, interests, and concerns of the non-technical public. In this paper, we present our response to these challenges by describing a novel “public-powered” approach: inviting the public to submit questions of interest about climate change to climate researchers before a planned engagement activity. Employing thematic content analysis on the submitted questions, we describe how the people we engaged with are curious about understanding climate change science, including how to mitigate related risks and threats by adopting specific actions. We assert that inviting the public to submit their questions of interest to researchers before an engagement activity can inform why and transform how actors engage in reflexive dialogue…(More)”.

Making forest data fair and open


Paper by Renato A. F. de Lima: “It is a truth universally acknowledged that those in possession of time and good fortune must be in want of information. Nowhere is this more so than for tropical forests, which include the richest and most productive ecosystems on Earth. Information on tropical forest carbon and biodiversity, and how these are changing, is immensely valuable, and many different stakeholders wish to use data on tropical and subtropical forests. These include scientists, governments, nongovernmental organizations and commercial interests, such as those extracting timber or selling carbon credits. Another crucial, often-ignored group are the local communities for whom forest information may help to assert their rights and conserve or restore their forests.

A widespread view is that to lead to better public outcomes it is necessary and sufficient for forest data to be open and ‘Findable, Accessible, Interoperable, Reusable’ (FAIR). There is indeed a powerful case. Open data — those that anyone can use and share without restrictions — can encourage transparency and reproducibility, foster innovation and be used more widely, thus translating into a greater public good (for example, https://creativecommons.org). Open biological collections and genetic sequences such as GBIF or GenBank have enabled species discovery, and open Earth observation data helps people to understand and monitor deforestation (for example, Global Forest Watch). But the perspectives of those who actually make the forest measurements are much less recognized, meaning that open and FAIR data can be extremely unfair indeed. We argue here that forest data policies and practices must be fair in the correct, linguistic use of the term — just and equitable.

In a world in which forest data origination — measuring, monitoring and sustaining forest science — is secured by large, long-term capital investment (such as through space missions and some officially supported national forest inventories), making all data open makes perfect sense. But where data origination depends on insecure funding and precarious employment conditions, top-down calls to make these data open can be deeply problematic. Even when well-intentioned, such calls ignore the socioeconomic context of the places where the forest plots are located and how knowledge is created, entrenching the structural inequalities that characterize scientific research and collaboration among and within nations. A recent review found scant evidence for open data ever lessening such inequalities. Clearly, only a privileged part of the global community is currently able to exploit the potential of open forest data. Meanwhile, some local communities are de facto owners of their forests and associated knowledge, so making information open — for example, the location of valuable species — may carry risks to themselves and their forests….(More)”.

Is AI Good for the Planet?


Book by Benedetta Brevini: “Artificial intelligence (AI) is presented as a solution to the greatest challenges of our time, from global pandemics and chronic diseases to cybersecurity threats and the climate crisis. But AI also contributes to the climate crisis by running on technology that depletes scarce resources and by relying on data centres that demand excessive energy use.

Is AI Good for the Planet? brings the climate crisis to the centre of debates around AI, exposing its environmental costs and forcing us to reconsider our understanding of the technology. It reveals why we should no longer ignore the environmental problems generated by AI. Embracing a green agenda for AI that puts the climate crisis at centre stage is our urgent priority.

Engaging and passionately written, this book is essential reading for scholars and students of AI, environmental studies, politics, and media studies and for anyone interested in the connections between technology and the environment…(More)”.

Befriending Trees to Lower a City’s Temperature


Peter Wilson at the New York Times: “New York, Denver, Shanghai, Ottawa and Los Angeles have all unveiled Million Tree Initiatives aimed at greatly increasing their urban forests because of the ability of trees to reduce city temperatures, absorb carbon dioxide and soak up excess rainfall.

Central Melbourne, on the other hand, lacks those cities’ financial firepower and is planning to plant a little more than 3,000 trees a year over the next decade. Yet it has gained the interest of other cities by using its extensive data to shore up the community engagement and political commitment required to sustain the decades-long work of building urban forests.

A small municipality covering just 14.5 square miles in the center of the greater Melbourne metropolitan area — which sprawls for 3,860 square miles and houses 5.2 million people in 31 municipalities — the city of Melbourne introduced its online map in 2013.

Called the Urban Forest Visual, the map displayed each of the 80,000 trees in its parks and streets, and showed each tree’s age, species and health. It also gave each tree its own email address so that people could help to monitor the trees and alert council workers to any specific problems.

That is when the magic happened.

City officials were surprised to see the trees receiving thousands of love letters. They ranged from jaunty greetings — “good luck with the photosynthesis” — to love poems and emotional tributes about how much joy the trees brought to people’s lives….(More)”.

Citizen science air quality project in Brussels reveals disparity in pollution levels


Article by Smart Cities World: “A citizen science air quality project in Brussels has revealed a striking disparity in air pollution levels across the city.

It shows that socio-economically vulnerable neighbourhoods are more likely to suffer from poor air quality. The dataset also shows that air quality in the city has improved, but that pollution still takes a major toll on health.

Between 25 September and 23 October 2021, 3,000 citizens participated in CurieuzenAir, the largest ever citizen science project on air quality in the Belgian capital…

The project is an initiative of the University of Antwerp, urban movement BRAL and Université libre de Bruxelles, in close cooperation with Brussels Environnement, De Standaard, Le Soir and Bruzz. This programme is supported by Bloomberg Philanthropies’ Brussels Clean Air Partnership.

For one month, citizen scientists mapped the concentration of nitrogen dioxide (NO2) – a key indicator of air pollution caused by traffic – in their streets via measuring tubes on the facades of their homes.

The project resulted in a unique dataset showing the impact of road traffic on air quality in Brussels in great detail. Results range from “excellent” to “extremely poor” air quality across Brussels, with a stark contrast in air quality between socio-economically vulnerable neighbourhoods and green, well-off ones.

An interactive dot map shows how the air quality differs greatly from neighbourhood to neighbourhood, and even from street to street. From blue dots (0–15 µg/m³; “very good”) to a number of jet-black dots (>50 µg/m³; “extremely bad”), the CurieuzenAir dataset makes it clear that these differences are explained by emissions from Brussels traffic….
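To make that colour scale concrete, here is a minimal sketch of the binning such a dot map implies. Only the endpoints (0–15 µg/m³ “very good”, >50 µg/m³ “extremely bad”) come from the article; the intermediate thresholds and labels are illustrative assumptions, not CurieuzenAir’s published scale.

```python
# Illustrative only: bin an NO2 reading (µg/m³) into a qualitative label.
# The 15 and 50 µg/m³ endpoints are reported in the article; the thresholds
# in between are assumed for the sake of the example.

def no2_category(ug_m3: float) -> str:
    """Map a measured NO2 concentration (µg/m³) to a map-colour label."""
    if ug_m3 < 0:
        raise ValueError("concentration cannot be negative")
    if ug_m3 <= 15:
        return "very good"      # blue dots
    if ug_m3 <= 25:             # assumed threshold
        return "moderate"
    if ug_m3 <= 35:             # assumed threshold
        return "poor"
    if ug_m3 <= 50:             # assumed threshold
        return "very poor"
    return "extremely bad"      # jet-black dots

# Two hypothetical street readings
for reading in (12.4, 53.1):
    print(f"{reading:5.1f} µg/m³ -> {no2_category(reading)}")
```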

Alain Maron, Brussels minister for climate transition, environment, social affairs and health, said: “CurieuzenAir is a great example of the importance of citizen science. Thanks to all the citizens that took part in the project, we collected unprecedented results on air pollution in Brussels, which help us to better understand the problem in our city.

“While we see that the situation is slowly improving, the concentrations measured still remain unacceptable, and call for urgent, in-depth action. We need to make sure that everyone in the city, wherever they live and whatever they earn, gets to breathe clean and healthy air.”…(More)”.

Repeat photos show change in southern African landscapes: a citizen science project


Paper by Timm Hoffman and Hana Petersen: “Every place in the world has a history. To understand it in the present you need some knowledge of its past. The history of the earth can be read from its rocks; the history of life, from the evolutionary histories and relationships of its species. But what of the history of modern landscapes and the many benefits we derive from them, such as water and food? What are their histories – and how are they shifting in response to the intense pressures they face from climate change and from people?

Historical landscape photographs provide one way of measuring this. They capture the way things were at a moment in time. By standing at the same place and re-photographing the same scene, it is possible to document the nature of change. Sometimes researchers can even measure the extent and rate of change for different elements in the landscape.
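Where such paired images can be scored (for example, by estimating the fraction of a fixed scene covered by shrubs), the extent and rate of change reduce to simple arithmetic. A minimal sketch, with hypothetical values:

```python
# Illustrative sketch of extent and rate of change from a repeat-photo pair.
# The years and cover fractions below are hypothetical, not rePhotoSA data.

year_then, cover_then = 1920, 0.12   # assumed shrub-cover fraction, old photo
year_now, cover_now = 2015, 0.38     # assumed fraction in the repeat photo

extent = cover_now - cover_then             # total change in cover
rate = extent / (year_now - year_then)      # average change per year

print(f"Cover change: {extent:+.2f} over {year_now - year_then} years")
print(f"Average rate: {rate:+.4f} per year")
```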

Reasons for the change can also sometimes be observed from this and other historical information, such as the climate or fire record. All of these data can then be related to what has been written about environmental change using other approaches and models. Researchers can ascertain whether the environment has reached a critical threshold and consider how to respond to the changes.

This is what repeat photography is all about…

The rePhotoSA project was launched in August 2015. The idea is to involve interested members of the public in re-photographing historical locations. This has two benefits. First, participants add to the number of repeated images. Second, public awareness of landscape change is raised.

The project website has over 6,000 historical images from ten primary photographic collections of southern African landscapes, dating from the late 1800s to the early 2000s. The geographic spread of the photographs is influenced largely by the interests of the original photographers. Often these photographs are donated to the project by family members, or institutions to which the original photographers belonged – and sometimes by the photographers themselves….(More)

The Staggering Ecological Impacts of Computation and the Cloud


Essay by Steven Gonzalez Monserrate: “While in technical parlance the “Cloud” might refer to the pooling of computing resources over a network, in popular culture, “Cloud” has come to signify and encompass the full gamut of infrastructures that make online activity possible, everything from Instagram to Hulu to Google Drive. Like a puffy cumulus drifting across a clear blue sky, refusing to maintain a solid shape or form, the Cloud of the digital is elusive, its inner workings largely mysterious to the wider public, an example of what MIT cybernetician Norbert Wiener once called a “black box.” But just as the clouds above us, however formless or ethereal they may appear to be, are in fact made of matter, the Cloud of the digital is also relentlessly material.

To get at the matter of the Cloud we must unravel the coils of coaxial cables, fiber optic tubes, cellular towers, air conditioners, power distribution units, transformers, water pipes, computer servers, and more. We must attend to its material flows of electricity, water, air, heat, metals, minerals, and rare earth elements that undergird our digital lives. In this way, the Cloud is not only material, but is also an ecological force. As it continues to expand, its environmental impact increases, even as the engineers, technicians, and executives behind its infrastructures strive to balance profitability with sustainability. Nowhere is this dilemma more visible than in the walls of the infrastructures where the content of the Cloud lives: the factory libraries where data is stored and computational power is pooled to keep our cloud applications afloat….

To quell this thermodynamic threat, data centers overwhelmingly rely on air conditioning, a mechanical process that refrigerates the gaseous medium of air, so that it can displace or lift perilous heat away from computers. Today, power-hungry computer room air conditioners (CRACs) or computer room air handlers (CRAHs) are staples of even the most advanced data centers. In North America, most data centers draw power from “dirty” electricity grids, especially in Virginia’s “data center alley,” the site of 70 percent of the world’s internet traffic in 2019. To cool, the Cloud burns carbon, what Jeffrey Moro calls an “elemental irony.” In most data centers today, cooling accounts for greater than 40 percent of electricity usage….(More)”.
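To put that figure in context, here is a back-of-envelope sketch using the industry’s standard Power Usage Effectiveness metric (PUE: total facility power divided by IT equipment power). The 10 MW facility draw is a hypothetical number, and treating all non-cooling power as IT load ignores lighting and power-conversion losses, so the implied PUE is only a lower bound.

```python
# Back-of-envelope sketch: what "cooling accounts for greater than 40 percent
# of electricity usage" implies for Power Usage Effectiveness (PUE).
# The 10 MW total draw is a hypothetical figure chosen for illustration.

total_draw_mw = 10.0     # assumed total facility power draw
cooling_share = 0.40     # cooling share of electricity, per the essay

cooling_mw = total_draw_mw * cooling_share   # power spent on cooling
it_mw = total_draw_mw - cooling_mw           # upper bound on IT load
                                             # (ignores lighting, conversion losses)

pue_lower_bound = total_draw_mw / it_mw      # PUE = total power / IT power
print(f"Cooling load: {cooling_mw:.1f} MW")
print(f"Implied PUE lower bound: {pue_lower_bound:.2f}")
```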

The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future


Book by Kelly Bronson: “Every new tractor now contains built-in sensors that collect data and stream it to cloud-based infrastructure. Seed and chemical companies are using these data, and these agribusinesses are a form of big tech alongside firms like Google and Facebook.

The Immaculate Conception of Data peeks behind the secretive legal agreements surrounding agricultural big data to trace how it is used and with what consequences. Agribusinesses are among the oldest oligopoly corporations in the world, and their concentration gives them an advantage over other food system actors. Kelly Bronson explores what happens when big data get caught up in pre-existing arrangements of power. Her richly ethnographic account details the work of corporate scientists, farmers using the data, and activist “hackers” building open-source data platforms. Actors working in private and public contexts have divergent views on whom new technology is for, how it should be developed, and what kinds of agriculture it should support. Surprisingly, despite their differences, these groups share a way of speaking about data and its value for the future. Bronson calls this the immaculate conception of data, arguing that this phenomenon is a dangerous framework for imagining big data and what it might do for society.

Drawing our attention to agriculture as an important new site for big tech criticism, The Immaculate Conception of Data uniquely bridges science and technology studies, critical data studies, and food studies, bringing to light salient issues related to data justice and a sustainable food system…(More)”.

How climate data scarcity costs lives


Paula Dupraz-Dobias at New Humanitarian: “Localised data can help governments project climate forecasts, prepare for disasters as early as possible, and create long-term policies for adapting to climate change.

Wealthier countries tend to have better access to new technology that allows for more accurate predictions, such as networks of temperature, wind, and atmospheric pressure sensors.

But roughly half the world’s countries do not have multi-hazard early warning systems, according to the UN’s World Meteorological Organization. Some 60 percent lack basic water information services designed to gather and analyse data on surface, ground, and atmospheric water, which could help reduce flooding and better manage water. Some 43 percent do not communicate or interact adequately with other countries to share potentially life-saving information.

The black holes in weather data around the globe

[Map: Availability of surface land observations. Source: WMO/ECMWF]

The US reports weather observations every three hours, as opposed to the hourly reporting required by World Meteorological Organization regulations. It says it will comply with these from 2023.

See WIGOS’s full interactive map

“Right now, we can analyse weather; in other words, what happens today, tomorrow, and the day after,” said Ena Jaimes Espinoza, a weather expert at CENEPRED, Peru’s national centre for disaster monitoring, prevention, and risk reduction. “For climate data, where you need years of data, there is still a dearth [of information].”

Without this information, she said, it’s difficult to establish accurate trends in different areas of the country – trends that could help forecasters better predict conditions in Tarucani, for example, or help policymakers to plan responses.

Inadequate funding, poor data-sharing between countries, and conflict, at least in some parts of the world, contribute to the data shortfalls. Climate experts warn that some of the world’s most disaster-vulnerable countries risk being left behind as this information gap widens…(More)”.