Making forest data fair and open

Paper by Renato A. F. de Lima: “It is a truth universally acknowledged that those in possession of time and good fortune must be in want of information. Nowhere is this more so than for tropical forests, which include the richest and most productive ecosystems on Earth. Information on tropical forest carbon and biodiversity, and how these are changing, is immensely valuable, and many different stakeholders wish to use data on tropical and subtropical forests. These include scientists, governments, nongovernmental organizations and commercial interests, such as those extracting timber or selling carbon credits. Another crucial, often-ignored group are the local communities for whom forest information may help to assert their rights and conserve or restore their forests.

A widespread view is that to lead to better public outcomes it is necessary and sufficient for forest data to be open and ‘Findable, Accessible, Interoperable, Reusable’ (FAIR). There is indeed a powerful case. Open data — those that anyone can use and share without restrictions — can encourage transparency and reproducibility, foster innovation and be used more widely, thus translating into a greater public good. For example, open biological collections and genetic sequences such as GBIF or GenBank have enabled species discovery, and open Earth observation data helps people to understand and monitor deforestation (for example, Global Forest Watch). But the perspectives of those who actually make the forest measurements are much less recognized, meaning that open and FAIR data can be extremely unfair indeed. We argue here that forest data policies and practices must be fair in the correct, linguistic use of the term — just and equitable.

In a world in which forest data origination — measuring, monitoring and sustaining forest science — is secured by large, long-term capital investment (such as through space missions and some officially supported national forest inventories), making all data open makes perfect sense. But where data origination depends on insecure funding and precarious employment conditions, top-down calls to make these data open can be deeply problematic. Even when well-intentioned, such calls ignore the socioeconomic context of the places where the forest plots are located and how knowledge is created, entrenching the structural inequalities that characterize scientific research and collaboration among and within nations. A recent review found scant evidence for open data ever lessening such inequalities. Clearly, only a privileged part of the global community is currently able to exploit the potential of open forest data. Meanwhile, some local communities are de facto owners of their forests and associated knowledge, so making information open — for example, the location of valuable species — may carry risks to themselves and their forests….(More)”.

Is AI Good for the Planet?

Book by Benedetta Brevini: “Artificial intelligence (AI) is presented as a solution to the greatest challenges of our time, from global pandemics and chronic diseases to cybersecurity threats and the climate crisis. But AI also contributes to the climate crisis by running on technology that depletes scarce resources and by relying on data centres that demand excessive energy use.

Is AI Good for the Planet? brings the climate crisis to the centre of debates around AI, exposing its environmental costs and forcing us to reconsider our understanding of the technology. It reveals why we should no longer ignore the environmental problems generated by AI. Embracing a green agenda for AI that puts the climate crisis at centre stage is our urgent priority.

Engaging and passionately written, this book is essential reading for scholars and students of AI, environmental studies, politics, and media studies and for anyone interested in the connections between technology and the environment…(More)”.

Befriending Trees to Lower a City’s Temperature

Peter Wilson at the New York Times: “New York, Denver, Shanghai, Ottawa and Los Angeles have all unveiled Million Tree Initiatives aimed at greatly increasing their urban forests because of the ability of trees to reduce city temperatures, absorb carbon dioxide and soak up excess rainfall.

Central Melbourne, on the other hand, lacks those cities’ financial firepower and is planning to plant a little more than 3,000 trees a year over the next decade. Yet it has gained the interest of other cities by using its extensive data to shore up the community engagement and political commitment required to sustain the decades-long work of building urban forests.

A small municipality covering just 14.5 square miles in the center of the greater Melbourne metropolitan area — which sprawls for 3,860 square miles and houses 5.2 million people in 31 municipalities — the city of Melbourne introduced its online map in 2013.

Called the Urban Forest Visual, the map displayed each of the 80,000 trees in its parks and streets, and showed each tree’s age, species and health. It also gave each tree its own email address so that people could help to monitor them and alert council workers to any specific problems.

That is when the magic happened.

City officials were surprised to see the trees receiving thousands of love letters. They ranged from jaunty greetings — “good luck with the photosynthesis” — to love poems and emotional tributes about how much joy the trees brought to people’s lives….(More)”.

Citizen science air quality project in Brussels reveals disparity in pollution levels

Article by Smart Cities World: “A citizen science air quality project in Brussels has revealed a striking disparity in air pollution levels across the city.

It shows that socio-economically vulnerable neighbourhoods are more likely to suffer from poor air quality. The dataset also shows that air quality in the city has improved, but that there is still a major health impact.

Between 25 September and 23 October 2021, 3,000 citizens participated in CurieuzenAir, the largest ever citizen science project on air quality in the Belgian capital…

The project is an initiative of the University of Antwerp, urban movement BRAL and Université libre de Bruxelles, in close cooperation with Brussels Environnement, De Standaard, Le Soir and Bruzz. This programme is supported by Bloomberg Philanthropies’ Brussels Clean Air Partnership.

For one month, citizen scientists mapped the concentration of nitrogen dioxide (NO2) – a key indicator of air pollution caused by traffic – in their streets via measuring tubes on the facades of their homes.

The project resulted in a unique dataset showing the impact of road traffic on air quality in Brussels in great detail. Results range from “excellent” to “extremely poor” air quality across Brussels, with a stark contrast in air quality between socio-economically vulnerable neighbourhoods and green, well-off ones.

An interactive dot map shows how the air quality differs greatly from neighbourhood to neighbourhood, and even from street to street. From blue dots (0-15 µg m⁻³; “very good”) to a number of jet-black dots (>50 µg m⁻³; “extremely bad”), the CurieuzenAir dataset makes it clear that these differences are explained by emissions from Brussels traffic….
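
The article states only the two endpoint bands of the dot map (0-15 µg m⁻³ shown as blue, “very good”; above 50 µg m⁻³ shown as jet-black, “extremely bad”). A minimal sketch of how tube measurements might be binned into such colour-coded labels could look as follows; the intermediate cut-offs and labels are illustrative placeholders, not CurieuzenAir’s actual scale:

```python
# Sketch: binning monthly-mean NO2 tube measurements into quality bands,
# as on the CurieuzenAir dot map. Only the endpoints are stated in the
# article (0-15 ug/m3 = "very good", >50 ug/m3 = "extremely bad"); the
# middle cut-offs below are hypothetical, for illustration only.
ILLUSTRATIVE_BANDS = [
    (15.0, "very good"),   # blue dots (stated in the article)
    (25.0, "moderate"),    # placeholder cut-off
    (35.0, "poor"),        # placeholder cut-off
    (50.0, "very poor"),   # placeholder cut-off
]

def classify_no2(ug_per_m3: float) -> str:
    """Map an NO2 concentration (ug/m3) to an air-quality label."""
    for upper, label in ILLUSTRATIVE_BANDS:
        if ug_per_m3 <= upper:
            return label
    return "extremely bad"  # jet-black dots, >50 ug/m3 (stated)

print(classify_no2(12.0))   # "very good"
print(classify_no2(55.0))   # "extremely bad"
```

One dot per measuring tube, coloured by the returned label, would reproduce the street-by-street contrasts the article describes.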

Alain Maron, Brussels minister for climate transition, environment, social affairs and health, said: “CurieuzenAir is a great example of the importance of citizen science. Thanks to all the citizens that took part in the project, we collected unprecedented results on air pollution in Brussels, which help us to better understand the problem in our city.

“While we see that the situation is slowly improving, the concentrations measured still remain unacceptable, and call for urgent, in-depth action. We need to make sure that everyone in the city, wherever they live and whatever they earn, gets to breathe clean and healthy air.”…(More)”.

Repeat photos show change in southern African landscapes: a citizen science project

Paper by Timm Hoffman and Hana Petersen: “Every place in the world has a history. To understand it in the present you need some knowledge of its past. The history of the earth can be read from its rocks; the history of life, from the evolutionary histories and relationships of its species. But what of the history of modern landscapes and the many benefits we derive from them, such as water and food? What are their histories – and how are they shifting in response to the intense pressures they face from climate change and from people?

Historical landscape photographs provide one way of measuring this. They capture the way things were at a moment in time. By standing at the same place and re-photographing the same scene, it is possible to document the nature of change. Sometimes researchers can even measure the extent and rate of change for different elements in the landscape.

Reasons for the change can also sometimes be observed from this and other historical information, such as the climate or fire record. All of these data can then be related to what has been written about environmental change using other approaches and models. Researchers can ascertain whether the environment has reached a critical threshold and consider how to respond to the changes.
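
A crude numerical sketch of the idea, assuming the historical photo and its repeat have already been registered to the same viewpoint and reduced to grayscale intensity grids (real repeat-photography analyses classify landscape elements such as vegetation cover rather than raw pixels; the threshold here is an arbitrary assumption):

```python
# Sketch: quantifying change between a historical photo and its repeat.
# Assumes both images are registered (same viewpoint, same size) and
# stored as grayscale grids of intensities in [0, 255]. Illustrative only.
def changed_fraction(old, new, threshold=30):
    """Fraction of pixels whose intensity shifted by more than `threshold`."""
    total = changed = 0
    for row_old, row_new in zip(old, new):
        for a, b in zip(row_old, row_new):
            total += 1
            if abs(a - b) > threshold:
                changed += 1
    return changed / total if total else 0.0

historical = [[100, 100], [200, 200]]
repeat     = [[105, 100], [120, 200]]   # one pixel darkened markedly
print(changed_fraction(historical, repeat))  # 0.25: one pixel in four changed
```

Repeating the comparison across a sequence of repeats of the same scene gives a rough rate of change over time.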

This is what repeat photography is all about…

The rePhotoSA project was launched in August 2015. The idea is to involve interested members of the public in re-photographing historical locations. This has two benefits. First, participants add to the number of repeated images. Second, public awareness of landscape change is raised.

The project website has over 6,000 historical images from ten primary photographic collections of southern African landscapes, dating from the late 1800s to the early 2000s. The geographic spread of the photographs is influenced largely by the interests of the original photographers. Often these photographs are donated to the project by family members, or institutions to which the original photographers belonged – and sometimes by the photographers themselves….(More)

The Staggering Ecological Impacts of Computation and the Cloud

Essay by Steven Gonzalez Monserrate: “While in technical parlance the “Cloud” might refer to the pooling of computing resources over a network, in popular culture, “Cloud” has come to signify and encompass the full gamut of infrastructures that make online activity possible, everything from Instagram to Hulu to Google Drive. Like a puffy cumulus drifting across a clear blue sky, refusing to maintain a solid shape or form, the Cloud of the digital is elusive, its inner workings largely mysterious to the wider public, an example of what MIT cybernetician Norbert Wiener once called a “black box.” But just as the clouds above us, however formless or ethereal they may appear to be, are in fact made of matter, the Cloud of the digital is also relentlessly material.

To get at the matter of the Cloud we must unravel the coils of coaxial cables, fiber optic tubes, cellular towers, air conditioners, power distribution units, transformers, water pipes, computer servers, and more. We must attend to its material flows of electricity, water, air, heat, metals, minerals, and rare earth elements that undergird our digital lives. In this way, the Cloud is not only material, but is also an ecological force. As it continues to expand, its environmental impact increases, even as the engineers, technicians, and executives behind its infrastructures strive to balance profitability with sustainability. Nowhere is this dilemma more visible than in the walls of the infrastructures where the content of the Cloud lives: the factory libraries where data is stored and computational power is pooled to keep our cloud applications afloat….

To quell this thermodynamic threat, data centers overwhelmingly rely on air conditioning, a mechanical process that refrigerates the gaseous medium of air, so that it can displace or lift perilous heat away from computers. Today, power-hungry computer room air conditioners (CRACs) or computer room air handlers (CRAHs) are staples of even the most advanced data centers. In North America, most data centers draw power from “dirty” electricity grids, especially in Virginia’s “data center alley,” the site of 70 percent of the world’s internet traffic in 2019. To cool, the Cloud burns carbon, what Jeffrey Moro calls an “elemental irony.” In most data centers today, cooling accounts for more than 40 percent of electricity usage….(More)”.

The Immaculate Conception of Data: Agribusiness, Activists, and Their Shared Politics of the Future

Book by Kelly Bronson: “Every new tractor now contains built-in sensors that collect data and stream it to cloud-based infrastructure. Seed and chemical companies are using these data, and these agribusinesses are a form of big tech alongside firms like Google and Facebook.

The Immaculate Conception of Data peeks behind the secretive legal agreements surrounding agricultural big data to trace how it is used and with what consequences. Agribusinesses are among the oldest oligopoly corporations in the world, and their concentration gives them an advantage over other food system actors. Kelly Bronson explores what happens when big data get caught up in pre-existing arrangements of power. Her richly ethnographic account details the work of corporate scientists, farmers using the data, and activist “hackers” building open-source data platforms. Actors working in private and public contexts have divergent views on whom new technology is for, how it should be developed, and what kinds of agriculture it should support. Surprisingly, despite their differences, these groups share a way of speaking about data and its value for the future. Bronson calls this the immaculate conception of data, arguing that this phenomenon is a dangerous framework for imagining big data and what it might do for society.

Drawing our attention to agriculture as an important new site for big tech criticism, The Immaculate Conception of Data uniquely bridges science and technology studies, critical data studies, and food studies, bringing to light salient issues related to data justice and a sustainable food system…(More)”.

How climate data scarcity costs lives

Paula Dupraz-Dobias at New Humanitarian: “Localised data can help governments project climate forecasts, prepare for disasters as early as possible, and create long-term policies for adapting to climate change.

Wealthier countries tend to have better access to new technology that allows for more accurate predictions, such as networks of temperature, wind, and atmospheric pressure sensors.

But roughly half the world’s countries do not have multi-hazard early warning systems, according to the UN’s World Meteorological Organization. Some 60 percent lack basic water information services designed to gather and analyse data on surface, ground, and atmospheric water, which could help reduce flooding and better manage water. Some 43 percent do not communicate or interact adequately with other countries to share potentially life-saving information.

The black holes in weather data around the globe

Availability of surface land observations (Map: WMO/ECMWF)

The US reports weather observations every three hours, as opposed to the hourly observations required by World Meteorological Organization regulations. It says it will comply with these from 2023.

See WIGOS’s full interactive map

“Right now, we can analyse weather; in other words, what happens today, tomorrow, and the day after,” said Ena Jaimes Espinoza, a weather expert at CENEPRED, Peru’s national centre for disaster monitoring, prevention, and risk reduction. “For climate data, where you need years of data, there is still a dearth [of information].”

Without this information, she said, it’s difficult to establish accurate trends in different areas of the country – trends that could help forecasters better predict conditions in Tarucani, for example, or help policymakers to plan responses.

Inadequate funding, poor data-sharing between countries, and conflict, at least in some parts of the world, contribute to the data shortfalls. Climate experts warn that some of the world’s most disaster-vulnerable countries risk being left behind as this information gap widens…(More)”.

Automating the War on Noise Pollution

Article by Linda Poon: “Any city dweller is no stranger to the frequent revving of motorbikes and car engines, made all the more intolerable after the months of silence during pandemic lockdowns. Some cities have decided to take action. 

Paris police set up an anti-noise patrol in 2020 to ticket motorists whose vehicles exceed a certain decibel level, and soon the city will start piloting the use of noise sensors in two neighborhoods. Called Medusa, each device uses four microphones to detect and measure noise levels, and two cameras to help authorities track down the culprit. No decibel threshold or fines will be set during the three-month trial period, according to the French newspaper Libération, but the pilot will test the potential and limits of automating the war on sound pollution.
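
The core measurement such a sensor performs can be sketched in a few lines: convert a window of microphone samples into a root-mean-square level in decibels and compare it against a threshold. This is a generic illustration, not Medusa’s actual algorithm; the reference level and the threshold used below are assumptions (and, as the article notes, the Paris trial sets no threshold at all):

```python
import math

# Sketch: turning raw microphone samples into a decibel reading and
# flagging loud vehicles. Generic signal-processing illustration only;
# the reference level and threshold are hypothetical.
def rms_decibels(samples, reference=1.0):
    """RMS level of `samples`, in dB relative to `reference` amplitude."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / reference)

def exceeds(samples, threshold_db):
    """True if the window's RMS level is above the (assumed) threshold."""
    return rms_decibels(samples) > threshold_db

quiet = [0.01, -0.01, 0.01, -0.01]   # RMS 0.01 -> -40 dB
loud  = [0.5, -0.5, 0.5, -0.5]       # RMS 0.5  -> about -6 dB
print(exceeds(loud, -20), exceeds(quiet, -20))  # True False
```

In a deployed device this level check would only trigger the cameras; attributing the noise to a specific vehicle is the harder problem the trial is meant to probe.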

Cities like Toronto and Philadelphia are also considering deploying similar tools. By now, research has been mounting about the health effects of continuous noise exposure, including links to high blood pressure and heart disease, and to poor mental health. And for years, many cities have been tackling noise through ordinances and urban design, including various bans on leaf blowers, on construction at certain hours and on cars. Some have even hired “night mayors” to, among other things, address complaints about after-hours noise.

But enforcement, even with the help of simple camera-and-noise radars, has been a challenge. Since 2018, the Canadian city of Edmonton has been piloting the use of four radars attached to light poles at busy intersections in the downtown area. A 2021 report on the second phase of the project, completed in 2020, found that officials had to manually sift through the data to take out noise made by, say, sirens. And the recordings didn’t always provide strong enough evidence against the offender in court. It was also costly: the pilot cost taxpayers $192,000, while fines generated a little more than half that amount, according to CTV News Edmonton.

Those obstacles have made noise pollution an increasingly popular target for smart city innovation, with companies and researchers looking to make environmental monitoring systems do more than just measure decibel levels…(More)”.

A tale of two labs: Rethinking urban living labs for advancing citizen engagement in food system transformations

Paper by Anke Brons et al: “Citizen engagement is heralded as essential for food democracy and equality, yet the implementation of inclusive citizen engagement mechanisms in urban food systems governance has lagged behind. This paper aims to further the agenda of citizen engagement in the transformation towards healthy and sustainable urban food systems by offering a conceptual reflection on urban living labs (ULLs) as a methodological platform. Over the past decades, ULLs have become increasingly popular to actively engage citizens in methodological testbeds for innovations within real-world settings. The paper proposes that ULLs as a tool for inclusive citizen engagement can be utilized in two ways: (i) the ULL as the daily life of which citizens are the experts, aimed at uncovering the unreflexive agency of a highly diverse population in co-shaping the food system and (ii) the ULL as a break with daily life aimed at facilitating reflexive agency in (re)shaping food futures. We argue that both ULL approaches have the potential to facilitate inclusive citizen engagement in different ways by strengthening the breadth and the depth of citizen engagement respectively. The paper concludes by proposing a sequential implementation of the two types of ULL, paying attention to spatial configurations and the short-term nature of ULLs….(More)”.