More Companies Are Disclosing Their ESG Data, but Confusion on How Persists

Article by David Breg: “Public companies in the U.S. are increasingly disclosing sustainability information, but many say they struggle to report the fundamental climate data that regulators around the globe will likely require under incoming mandatory reporting standards.

Nearly two-thirds of respondents said their company was disclosing environmental, social and governance information, up from 56% in the prior year, according to the annual survey of sustainability officials that WSJ Pro conducted this spring.

However, there was little consensus on which framework to use, and respondents identified three fundamental types of information as their biggest environmental reporting challenges: greenhouse-gas emissions, climate-change risk and energy management.

The proportion of companies disclosing sustainability and ESG information was 63%, up from 56% last year. The share that doesn’t yet report this data but plans to do so was 16%, down from 25% last year. About one-fifth of respondents said their organization had no plans to report their progress, virtually unchanged from last year. Breaking that down, a quarter of private companies don’t plan any ESG reporting, while only 7% of public companies felt the same.

Regulators around the globe are finalizing rules that would require companies to publish standardized information after years of patchy voluntary ESG reporting based on a host of frameworks. California’s governor has said he would soon sign that state’s requirements into law. The U.S. Securities and Exchange Commission’s rules are expected later this year. European regulations are already in place and many other countries are also working on standards. The International Sustainability Standards Board hopes its climate framework, completed this past summer, becomes the global baseline.

While it is mostly public companies that face mandatory requirements, even private businesses face increased scrutiny of their sustainability and ESG policies from stakeholders including shareholders, eco-conscious consumers, suppliers, insurers and lenders…(More)”.

Computing the Climate: How We Know What We Know About Climate Change

Book by Steve M. Easterbrook: “How do we know that climate change is an emergency? How did the scientific community reach this conclusion all but unanimously, and what tools did they use to do it? This book tells the story of climate models, tracing their history from nineteenth-century calculations on the effects of greenhouse gases, to modern Earth system models that integrate the atmosphere, the oceans, and the land using the full resources of today’s most powerful supercomputers. Drawing on the author’s extensive visits to the world’s top climate research labs, this accessible, non-technical book shows how computer models help to build a more complete picture of Earth’s climate system. ‘Computing the Climate’ is ideal for anyone who has wondered where the projections of future climate change come from – and why we should believe them…(More)”.

The planet is too important to be left to activists: The guiding philosophy of the Climate Majority Project

Article by Jadzia Tedeschi and Rupert Read: “Increasing numbers of people around the world are convinced that human civilisation is teetering on the brink, but that our political “leaders” aren’t levelling with us about just how dire the climate outlook is. Quite a few of us are beginning to imagine collapse. And yet, for the most part, the responses available to individuals who want to take action seem to be limited to either consumer choices (minimising the amount of plastics we buy, using reusable coffee cups, recycling, and so on) or radical protests (such as gluing oneself to roads at busy intersections, disrupting sports matches, splashing soup on priceless art works, and risking imprisonment).

But there must be a space for action between these two alternatives. While the radical tactics of the Extinction Rebellion movement (XR) did succeed in nudging the public conversation concerning the climate and biodiversity crisis toward a new degree of seriousness, these same tactics also alienated people who would otherwise be sympathetic to XR’s cause and managed to give “climate activists” a bad name in the process. To put it simply, the radical tactics of XR could never achieve the kind of broad-based consensus that is needed to meaningfully respond to the current crisis.

We need a coordinated, collective effort at scale, which entails collaborating across social boundaries and political battlelines. If we are to prevent irrecoverable civilisational collapse, we need to demonstrate that taking care of the natural world is in everybody’s interest.

The Climate Majority Project works to inspire, fund, connect, coordinate, and scale citizen-led initiatives in workplaces, local communities, and strategic professional networks to reach beyond the boundaries of activism-as-usual. It is our endeavour to instantiate the kind of ambitious, moderate flank to XR that Rupert Read has previously called for. The plan is to prove the concept in the UK and then go global — albeit at a slower pace; after all, moderation is rarely adorned with fireworks…(More)”.

Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models

Paper by Pengfei Li, Jianyi Yang, Mohammad A. Islam, Shaolei Ren: “The growing carbon footprint of artificial intelligence (AI) models, especially large ones such as GPT-3 and GPT-4, has been undergoing public scrutiny. However, the equally important and enormous water footprint of AI models has remained under the radar. For example, training GPT-3 in Microsoft’s state-of-the-art U.S. data centers can directly consume 700,000 liters of clean freshwater (enough for producing 370 BMW cars or 320 Tesla electric vehicles), and the water consumption would have tripled if training were done in Microsoft’s Asian data centers, but such information has been kept secret. This is extremely concerning, as freshwater scarcity has become one of the most pressing challenges shared by all of us in the wake of the rapidly growing population, depleting water resources, and aging water infrastructures. To respond to the global water challenges, AI models can, and also should, take social responsibility and lead by example by addressing their own water footprint. In this paper, we provide a principled methodology to estimate the fine-grained water footprint of AI models, and also discuss the unique spatial-temporal diversities of AI models’ runtime water efficiency. Finally, we highlight the necessity of holistically addressing water footprint along with carbon footprint to enable truly sustainable AI…(More)”.
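The 700,000-liter figure is the kind of number the paper's methodology produces by combining energy use with water-efficiency factors: on-site cooling water plus the water consumed off-site to generate the electricity. A hedged back-of-the-envelope sketch of that style of estimate (the WUE, EWIF and PUE values below are illustrative assumptions, not the authors' exact parameters or code):

```python
# Rough operational water-footprint estimate for one training run.
# All efficiency factors are illustrative assumptions.
energy_kwh = 1_287_000   # widely reported GPT-3 training energy, ~1,287 MWh
pue = 1.2                # power usage effectiveness (assumed)
wue_onsite = 0.55        # on-site water usage effectiveness, liters/kWh (assumed)
ewif = 3.1               # off-site electricity water intensity, liters/kWh (assumed)

onsite_l = energy_kwh * wue_onsite    # water evaporated by on-site cooling
offsite_l = energy_kwh * pue * ewif   # water consumed generating the electricity
total_l = onsite_l + offsite_l

print(f"on-site: {onsite_l:,.0f} L, total: {total_l:,.0f} L")
```

With these assumed factors the on-site term comes out near 708,000 liters, the same order as the 700,000-liter figure quoted in the abstract; the off-site term shows why the full footprint is several times larger than the direct cooling water alone.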

Developing Wearable Technologies to Advance Understanding of Precision Environmental Health

Report by the National Academies of Sciences, Engineering, and Medicine: “The rapid proliferation of wearable devices that gather data on physical activity and physiology has become commonplace across various sectors of society. Concurrently, the development of advanced wearables and sensors capable of detecting a multitude of compounds presents new opportunities for monitoring environmental exposure risks. Wearable technologies are additionally showing promise in disease prediction, detection, and management, thereby offering potential advancements in the interdisciplinary fields of both environmental health and biomedicine.

To gain insight into this burgeoning field, on June 1 and 2, 2023, the National Academies of Sciences, Engineering, and Medicine organized a 2-day virtual workshop titled Developing Wearable Technologies to Advance Understanding of Precision Environmental Health. Experts from government, industry, and academia convened to discuss emerging applications and the latest advances in wearable technologies. The workshop aimed to explore the potential of wearables in capturing, monitoring, and predicting environmental exposures and risks to inform precision environmental health…(More)”.

Incentivising open ecological data using blockchain technology

Paper by Robert John Lewis, Kjell-Erik Marstein & John-Arvid Grytnes: “Mindsets concerning data as proprietary are common, especially where data production is resource intensive. Fears of competing research in concert with loss of exclusivity to hard-earned data are pervasive. This is for good reason given that current reward structures in academia focus overwhelmingly on journal prestige and high publication counts, not accredited publication of open datasets. There is also a reluctance among researchers to cede control to centralised repositories, citing concerns over the lack of trust and transparency in the way complex data are used and interpreted.

To begin to resolve these cultural and sociological constraints to open data sharing, we as a community must recognise that top-down pressure from policy alone is unlikely to improve the state of ecological data availability and accessibility. Open data policy is almost ubiquitous (e.g. the Joint Data Archiving Policy (JDAP)), and while cyber-infrastructures are becoming increasingly extensive, most have coevolved with sub-disciplines utilising high-velocity, born-digital data (e.g. remote sensing, automated sensor networks and citizen science). Consequently, they do not always offer technological solutions that ease data collation, standardisation, management and analytics, nor provide a good fit culturally to research communities working among the long tail of ecological science, i.e. science conducted by many individual researchers/teams over limited spatial and temporal scales. Given that the majority of scientific funding is spent on this type of dispersed research, there is a surprisingly large disconnect between the vast majority of ecological science and the cyber-infrastructures to support open data mandates, offering a possible explanation for why primary ecological data are reportedly difficult to find…(More)”.

Data can help decarbonize cities – let us explain

Article by Stephen Lorimer and Andrew Collinge: “The University of Birmingham, Alan Turing Institute and Centre for Net Zero are working together, using a tool developed by the Centre, called Faraday, to model a more detailed understanding of energy flows within the district and between it and the neighbouring 8,000 residents. Faraday is a generative AI model trained on one of the UK’s largest smart meter datasets. The model is helping to unlock a more granular view of energy sources and changing energy usage, providing the basis for modelling future energy consumption and local smart grid management.

The partners are investigating the role that trusted data aggregators can play if they can take raw data and desensitize it to a point where it can be shared without eroding consumer privacy or commercial advantage.

Data is central to both initiatives and all cities seeking a renewable energy transition. But there are issues to address, such as common data standards, governance and data competency frameworks (especially across the built environment supply chain)…

Building the governance, standards and culture that delivers confidence in energy data exchange is essential to maximizing the potential of carbon reduction technologies. This framework will ultimately support efficient supply chains and coordinate market activity. There are lessons from the Open Banking initiative, which provided the framework for traditional financial institutions, fintech and regulators to deliver innovation in financial products and services with carefully shared consumer data.

In the energy domain, there are numerous advantageous aspects to data sharing. It helps overcome barriers in the product supply chain, from materials to low-carbon technologies (heat pumps, smart thermostats, electric vehicle chargers etc). Free and Open-Source Software (FOSS) providers can use data to support installers and property owners.

Data interoperability allows third-party products and services to communicate with any end-user device through open or proprietary Internet of Things gateway platforms such as Tuya or IFTTT. A growing bank of post-installation data on the operation of buildings (such as energy efficiency and air quality) will boost confidence in the future quality of retrofits and make for easier decisions on planning approval and grid connections. Finally, data is increasingly considered key in securing the financing and private sector investment crucial to the net zero effort.

None of the above is easy. Organizational and technical complexity can slow progress but cities must be at the forefront of efforts to coordinate the energy data ecosystem and make the case for “data for decarbonization.”…(More)”.

The Eyewitness Community Survey: An Engaging Citizen Science Tool to Capture Reliable Data while Improving Community Participants’ Environmental Health Knowledge and Attitudes

Paper by Melinda Butsch Kovacic: “Many youths and young adults have variable environmental health knowledge, limited understanding of their local environment’s impact on their health, and poor environmentally friendly behaviors. We sought to develop and test a tool to reliably capture data, increase environmental health knowledge, and engage youths as citizen scientists to examine and take action on their community’s challenges. The Eyewitness Community Survey (ECS) was developed through several iterations of co-design. Herein, we tested its performance. In Phase I, seven youths audited five 360° photographs. In Phase II, 27 participants worked in pairs/trios and audited five locations, typically 7 days apart. Inter-rater and intra-rater reliability were determined. Changes in participants’ knowledge, attitudes, behaviors, and self-efficacy were surveyed. Feedback was obtained via focus groups. Intra-rater reliability was in the substantial/near-perfect range, with Phase II having greater consistency. Inter-rater reliability was high, with 42% and 63% of Phase I and II Kappa, respectively, in the substantial/near-perfect range. Knowledge scores improved after making observations (p ≤ 0.032). Participants (85%) reported the tool to be easy/very easy to use, with 70% willing to use it again. Thus, the ECS is a mutually beneficial citizen science tool that rigorously captures environmental data and provides engaging experiential learning opportunities…(More)”.
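The reliability figures in the abstract are Cohen's kappa values, read against the conventional Landis–Koch bands (0.61–0.80 "substantial", 0.81–1.00 "near perfect"). As a hedged illustration of the statistic itself, not the authors' code, kappa for two raters over the same items can be computed as:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # observed agreement: fraction of items both raters labelled the same
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # chance agreement: from each rater's marginal label frequencies
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)
```

For example, two auditors who agree on three of four hypothetical "safe"/"unsafe" ratings with the marginals below score kappa = 0.5 ("moderate"), even though raw agreement is 75%; the chance correction is what makes the 0.61+ thresholds meaningful: `cohens_kappa(["safe", "safe", "unsafe", "unsafe"], ["safe", "unsafe", "unsafe", "unsafe"])` returns 0.5.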

Unleashing the power of data for electric vehicles and charging infrastructure

Report by Thomas Deloison: “As the world moves toward widespread electric vehicle (EV) adoption, a key challenge lies ahead: deploying charging infrastructure rapidly and effectively. Solving this challenge will be essential to decarbonize transport, which has a higher reliance on fossil fuels than any other sector and accounts for a fifth of global carbon emissions. However, the companies and governments investing in charging infrastructure face significant hurdles, including high initial capital costs and difficulties related to infrastructure planning, permitting, grid connections and grid capacity development.

Data has the power to facilitate these processes: increased predictability and optimized planning and infrastructure management go a long way in easing investments and accelerating deployment. Last year, members of the World Business Council for Sustainable Development (WBCSD) demonstrated that digital solutions based on data sharing could reduce carbon emissions from charging by 15% and unlock crucial grid capacity and capital efficiency gains.

Exceptional advances in data, analytics and connectivity are making digital solutions a potent tool to plan and manage transport, energy and infrastructure. Thanks to the deployment of sensors and the rise of connectivity,  businesses are collecting information faster than ever before, allowing for data flows between physical assets. Charging infrastructure operators, automotive companies, fleet operators, energy providers, building managers and governments collect insights on all aspects of electric vehicle charging infrastructure (EVCI), from planning and design to charging experiences at the station.

The real value of data lies in its aggregation. This will require breaking down silos across industries and enabling digital collaboration. A digital action framework released by WBCSD, in collaboration with Arcadis, Fujitsu and other member companies and partners, introduces a set of recommendations for companies and governments to realize the full potential of digital solutions and accelerate EVCI deployments:

  • Map proprietary data, knowledge gaps and digital capacity across the value chain to identify possible synergies. The highest value potential from digital solutions will lie at the nexus of infrastructure, consumer behavior insights, grid capacity and transport policy. For example, to ensure the deployment of charging stations where they will be most needed and at the right capacity level, it is crucial to plan investments within energy grid capacity, spatial constraints and local projected demand for EVs.
  • Develop internal data collection and storage capacity with due consideration for existing structures for data sharing. A variety of schemes allow actors to engage in data sharing or monetization. Yet their use is limited by mismatched data standards and specifications and by process uncertainty. Companies must build a strong understanding of these structures internally by providing internal training and guidance, and invest in sound data collection, storage and analysis capacity.
  • Foster a policy environment that supports digital collaboration across sectors and industries. Digital policies must provide incentives and due diligence frameworks to guide data exchanges across industries and support the adoption of common standards and protocols. For instance, it will be crucial to integrate linkages with energy systems and infrastructure beyond roads in the rollout of the European mobility data space…(More)”.

Weather Warning Inequity: Lack of Data Collection Stations Imperils Vulnerable People

Article by Chelsea Harvey: “Devastating floods and landslides triggered by extreme downpours killed hundreds of people in Rwanda and the Democratic Republic of Congo in May, when some areas saw more than 7 inches of rain in a day.

Climate change is intensifying rainstorms throughout much of the world, yet scientists haven’t been able to show that the event was influenced by warming.

That’s because they don’t have enough data to investigate it.

Weather stations are sparse across Africa, making it hard for researchers to collect daily information on rainfall and other weather variables. The data that does exist often isn’t publicly available.

“The main issue in some countries in Africa is funding,” said Izidine Pinto, a senior researcher on weather and climate at the Royal Netherlands Meteorological Institute. “The meteorological offices don’t have enough funding.”

There’s often too little money to build or maintain weather stations, and strapped-for-cash governments often choose to sell the data they do collect rather than make it free to researchers.

That’s a growing problem as the planet warms and extreme weather worsens. Reliable forecasts are needed for early warning systems that direct people to take shelter or evacuate before disasters strike. And long-term climate data is necessary for scientists to build computer models that help make predictions about the future.

The science consortium World Weather Attribution is the latest research group to run into problems. It investigates the links between climate change and individual extreme weather events all over the globe. In the last few months alone, the organization has demonstrated the influence of global warming on extreme heat in South Asia and the Mediterranean, floods in Italy, and drought in eastern Africa.

Most of its research finds that climate change is making weather events more likely to occur or more intense.

The group recently attempted to investigate the influence of climate change on the floods in Rwanda and Congo. But the study was quickly mired in challenges.

The team was able to acquire some weather station data, mainly in Rwanda, Joyce Kimutai, a research associate at Imperial College London and a co-author of the study, said at a press briefing announcing the findings Thursday. But only a few stations provided sufficient data, making it impossible to define the event or to be certain that climate model simulations were accurate…(More)”.