2030 Compass CoLab


About: “2030 Compass CoLab invites a group of experts, using an online platform, to contribute their perspectives on potential interactions between the goals in the UN’s 2030 Agenda for Sustainable Development.

By combining the insight of participants who possess broad and diverse knowledge, we hope to develop a richer understanding of how the Sustainable Development Goals (SDGs) may be complementary or conflicting.

2030 Compass CoLab is part of a larger project, The Agenda 2030 Compass Methodology and toolbox for strategic decision making, funded by Vinnova, Sweden’s government agency for innovation.

Other elements of the larger project include:

  • Deliberations by a panel of experts who will convene in a series of live meetings to undertake in-depth analysis of interactions between the goals.
  • Quantitative analysis of SDG indicator time-series data, which will examine historical correlations between progress on the SDGs.
  • Development of a knowledge repository, residing in a new software tool under development as part of the project. This tool will be made available as a resource to guide the decisions of corporate executives, policy makers, and leaders of NGOs.

The overall project was inspired by the work of researchers at the Stockholm Environment Institute, described in Towards systemic and contextual priority setting for implementing the 2030 Agenda, a 2018 paper in Sustainability Science by Nina Weitz, Henrik Carlsen, Måns Nilsson, and Kristian Skånberg….(More)”.

As Jakarta floods again, humanitarian chatbots on social media support community-led disaster response


Blog by PetaBencana: “On February 20th, #banjir and #JakartaBanjir were the highest trending topics on Twitter Indonesia, as the capital city was inundated by its third major flood of the year, following particularly heavy rainfall from Friday night (19/02/2021) to Saturday morning (20/02/2021). As Jakarta residents turned to social media to share updates about the flood, they were greeted by “Disaster Bot” – a novel AI-assisted chatbot that monitors social media for posts about disasters and automatically invites users to submit more detailed disaster reports. These crowd-sourced reports are used to map disasters in real-time, on a free and open source website, PetaBencana.id.
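
The workflow described here (keyword monitoring, an automated invitation, then a structured report published to a live map) can be sketched in a few lines of Python. This is a hypothetical illustration of the pattern, not PetaBencana's actual code; all names, keywords, and fields below are assumptions.

```python
import re
from dataclasses import dataclass

# Hypothetical keyword filter; the real Disaster Bot's matching is
# AI-assisted and its implementation is not published in this excerpt.
FLOOD_PATTERN = re.compile(r"banjir|flood", re.IGNORECASE)

@dataclass
class SocialPost:
    user: str
    text: str

@dataclass
class DisasterReport:
    user: str
    location: tuple   # (latitude, longitude) supplied by the resident
    water_level_cm: int

def matches_disaster(post: SocialPost) -> bool:
    """Step 1: detect candidate disaster posts in the social media stream."""
    return bool(FLOOD_PATTERN.search(post.text))

def invitation(post: SocialPost) -> str:
    """Step 2: reply, inviting the user to file a structured report."""
    return (f"@{post.user} Are you experiencing flooding? "
            "Share your location and water level to map it in real time.")

def publish(report: DisasterReport, live_map: list) -> None:
    """Step 3: add the confirmed report to the public real-time map."""
    live_map.append(report)

# Minimal end-to-end walk-through with a fabricated post.
live_map: list = []
post = SocialPost("resident1", "Banjir again near Kemang, water is rising")
if matches_disaster(post):
    print(invitation(post))
    publish(DisasterReport("resident1", (-6.26, 106.81), 50), live_map)
print(f"{len(live_map)} report(s) now on the map")
```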

As flooding blocked major thoroughfares and toll roads, disrupted commuter lines, and cut off electricity to over 60,000 homes, residents continued to share updates about the flood situation in order to stay alert and make timely decisions about safety and response. Hundreds of residents submitted flood reports to PetaBencana.id, alerting each other about water levels, damaged infrastructure, and road accessibility. The Jakarta Emergency Management Agency also updated the map with official information about flood-affected areas, and monitored the map to respond to resident needs. PetaBencana.id experienced a 2,000% increase in activity in under 12 hours as residents actively checked the map to understand the flooding situation, avoid flooded areas, and make decisions about safety and response.

Residents share updates about flood-affected road access through the open source information sharing platform, PetaBencana.id. Thousands of residents used the map to navigate safely as heavy rainfall inundated the city for the third time this year.

As flooding incidents continue to occur with increasing intensity across the country, community-led information sharing is once again proving its significance in supporting response and planning at multiple scales. …(More)”.

Mapping urban temperature using crowd-sensing data and machine learning


Paper by Marius Zumwald, Benedikt Knüsel, David N. Bresch and Reto Knutti: “Understanding the patterns of urban temperature at a high spatial and temporal resolution is of great importance for urban heat adaptation and mitigation. Machine learning offers promising tools for high-resolution modeling of urban heat, but it requires large amounts of data. Measurements from official weather stations are too sparse but could be complemented by crowd-sensed measurements from citizen weather stations (CWS). Here we present an approach to model urban temperature using the quantile regression forest algorithm and CWS, open government and remote sensing data. The analysis is based on data from 691 sensors in the city of Zurich (Switzerland) during a heat wave, using data from 25–30 June 2019. We trained the model using hourly data from 25–29 June (n = 71,837) and evaluated the model using data from 30 June (n = 14,105). Based on the model, spatiotemporal temperature maps at 10 × 10 m resolution were produced. We demonstrate that our approach can accurately map urban heat at high spatial and temporal resolution without additional measurement infrastructure. We furthermore critically discuss and spatially map estimated prediction and extrapolation uncertainty. Our approach is able to inform highly localized urban policy and decision-making….(More)”.
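
The recipe in the abstract (train a quantile regression forest on hourly sensor data, hold out the final day, and keep predictive quantiles as an uncertainty estimate) can be approximated with standard tools. The sketch below uses synthetic data and approximates the quantile regression forest of Meinshausen (2006) by taking quantiles over per-tree predictions from scikit-learn's RandomForestRegressor; the feature set is a made-up stand-in for the paper's CWS, open government and remote sensing inputs, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for the paper's inputs: each row is one hourly CWS
# observation with predictors from open data layers (names hypothetical).
n = 2000
X = np.column_stack([
    rng.uniform(0, 24, n),   # hour of day
    rng.uniform(0, 1, n),    # building density around the sensor
    rng.uniform(0, 1, n),    # vegetation index (e.g. NDVI)
])
y = (20 + 8 * np.sin((X[:, 0] - 9) / 24 * 2 * np.pi)
     + 4 * X[:, 1] - 3 * X[:, 2]
     + rng.normal(0, 1, n))  # synthetic hourly air temperature [deg C]

# Train on earlier observations, hold out the rest (mimicking the
# 25-29 June training / 30 June evaluation split).
X_train, X_test, y_train, y_test = X[:1600], X[1600:], y[:1600], y[1600:]

forest = RandomForestRegressor(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)

# Approximate quantile regression forest output via quantiles over the
# individual trees' predictions (the full QRF instead uses the
# distribution of training targets in each leaf).
per_tree = np.stack([tree.predict(X_test) for tree in forest.estimators_])
q10, q50, q90 = np.quantile(per_tree, [0.1, 0.5, 0.9], axis=0)

# The 80% interval width is a simple per-location uncertainty estimate
# that could be mapped alongside the median temperature prediction.
print("median abs. error:", np.abs(q50 - y_test).mean().round(2), "deg C")
print("mean 80% interval width:", (q90 - q10).mean().round(2), "deg C")
```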

Climate TRACE


About: “We exist to make meaningful climate action faster and easier by mobilizing the global tech community—harnessing satellites, artificial intelligence, and collective expertise—to track human-caused emissions to specific sources in real time—independently and publicly.

Climate TRACE aims to drive stronger decision-making on environmental policy, investment, corporate sustainability strategy, and more.

WHAT WE DO

01 Monitor human-caused GHG emissions using cutting-edge technologies such as artificial intelligence, machine learning, and satellite image processing.

02 Collaborate with data scientists and emission experts from an array of industries to bring unprecedented transparency to global pollution monitoring.

03 Partner with leaders from the private and public sectors to share valuable insights in order to drive stronger climate policy and strategy.

04 Provide the necessary tools for anyone anywhere to make better decisions to mitigate and adapt to the impacts from climate change… (More)”

Geospatial Data Market Study


Study by Frontier Economics: “Frontier Economics was commissioned by the Geospatial Commission to carry out a detailed economic study of the size, features and characteristics of the UK geospatial data market. The Geospatial Commission was established within the Cabinet Office in 2018, as an independent, expert committee responsible for setting the UK’s Geospatial Strategy and coordinating public sector geospatial activity. The Geospatial Commission’s aim is to unlock the significant economic, social and environmental opportunities offered by location data. The UK’s Geospatial Strategy (2020) sets out how the UK can unlock the full power of location data and take advantage of these opportunities….

Like many other forms of data, the value of geospatial data is not limited to the data creator or data user. Value from using geospatial data can be subdivided into several different categories, based on who the value accrues to:

Direct use value: where value accrues to users of geospatial data. This could include government using geospatial data to better manage public assets like roadways.

Indirect use value: where value is also derived by indirect beneficiaries who interact with direct users. This could include users of the public assets who benefit from better public service provision.

Spillover use value: value that accrues to others who are neither direct data users nor indirect beneficiaries. This could, for example, include lower levels of emissions due to improved management of the road network by government. The benefits of lower emissions are felt by all of society, even those who do not use the road network.

As the value from geospatial data does not always accrue to the direct user of the data, there is a risk of underinvestment in geospatial technology and services. Our £6 billion estimate of turnover for a subset of geospatial firms in 2018 does not take account of these wider economic benefits that “spill over” across the UK economy, and generate additional value. As such, the value that geospatial data delivers is likely to be significantly higher than we have estimated and is therefore an area for potential future investment….(More)”.

Google launches new tool to help cities stay cool


Article by Justine Calma: “Google unveiled a tool today that could help cities keep their residents cool by mapping out where trees are needed most. Cities tend to be warmer than surrounding areas because buildings and asphalt trap heat. An easy way to cool metropolitan areas down is to plant more trees in neighborhoods where they’re sparse.

Google’s new Tree Canopy Lab uses aerial imagery and Google’s AI to figure out where every tree is in a city. Tree Canopy Lab puts that information on an interactive map along with additional data on which neighborhoods are more densely populated and are more vulnerable to high temperatures. The hope is that planting new trees in these areas could help cities adapt to a warming world and save lives during heat waves.
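
The prioritization the map supports (find neighborhoods that are densely populated, heat-vulnerable, and short on trees) reduces to a simple overlay of data layers. Below is a hypothetical numpy sketch of that idea; the grid values and the scoring formula are illustrative, not Google's.

```python
import numpy as np

# Hypothetical neighborhood grids, each value normalized to [0, 1]; in
# practice these would be rasters derived from aerial imagery (canopy),
# census data (population), and temperature maps (heat vulnerability).
canopy_cover = np.array([[0.6, 0.5], [0.1, 0.2]])  # fraction of tree cover
population   = np.array([[0.3, 0.4], [0.9, 0.8]])  # relative density
heat_risk    = np.array([[0.2, 0.3], [0.9, 0.7]])  # relative vulnerability

# Illustrative priority score: dense, hot neighborhoods with little
# existing canopy rank highest for new tree planting.
priority = (1 - canopy_cover) * population * heat_risk
row, col = np.unravel_index(np.argmax(priority), priority.shape)
print(f"highest-priority cell: ({row}, {col}), score {priority[row, col]:.2f}")
```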

Google piloted Tree Canopy Lab in Los Angeles. Data on hundreds more cities is on the way, the company says. City planners interested in using the tool in the future can reach out to Google through a form it posted along with today’s announcement.

“We’ll be able to really home in on where the best strategic investment will be in terms of addressing that urban heat,” says Rachel Malarich, Los Angeles’ first city forest officer.

Google claims that its new tool can save cities like Los Angeles time when it comes to taking inventory of their trees. That’s often done by sending people to survey each block. Los Angeles has also used LIDAR technology, which uses a laser sensor to detect trees, to map its urban forest in the past — but that process was expensive and slow, according to Malarich. Google’s new service, on the other hand, is free to use and will be updated regularly using images the company already takes by plane for Google Maps….(More)”.

Macron’s green democracy experiment gets political


Louise Guillot and Elisa Braun at Politico: “Emmanuel Macron asked 150 ordinary people to help figure out France’s green policies — and now this citizens’ convention is turning into a political problem for the French president.

The Citizens’ Convention on Climate was aimed at calming tensions in the wake of the Yellow Jackets protest movement — which was sparked by a climate tax on fuel — and showing that Macron wasn’t an out-of-touch elitist.

After nine months of deliberations, the convention came up with 149 proposals to slash greenhouse gas emissions this summer. The government has to put some of these measures before the parliament for them to become binding, and a bill is due to be presented in December.

But that’s too slow for many of the convention’s members, who feel the government is back-pedalling on some of the ideas and that Macron has poked fun at them.

Muriel Raulic, a member of the convention, accused Macron of using the body to greenwash his administration.

She supports a moratorium on 5G high-speed mobile technology, which has created some health and environmental fears. Macron has dismissed proponents of the ban as “Amish” — a Christian sect suspicious of technology.

The 150 members wrote an open letter to Macron in mid-October, complaining about a lack of “clear and defined support from the executive, whose positions sometimes appear contradictory,” and about “openly hostile communications” from “certain professional actors.”

Some gathered late last month before the National Assembly to complain they felt used and treated like “guinea pigs” by politicians. In June, they created an association to oversee what the government is doing with their proposals. 

…The government denied it is using the convention to greenwash itself….(More)”.

Poor data on groundwater jeopardizes climate resilience


Rebecca Root at Devex: “A lack of data on groundwater is impeding water management and could jeopardize climate resilience efforts in some places, according to recent research by WaterAid and the HSBC Water Programme.

Groundwater is found underground in gaps between soil, sand, and rock. Over 2.5 billion people are thought to depend on groundwater — which has a higher tolerance to droughts than other water sources — for drinking.

The report looked at groundwater security and sustainability in Bangladesh, Ghana, India, Nepal, and Nigeria, where collectively more than 160 million people lack access to clean water close to home. It found that groundwater data tends to be limited — including on issues such as overextraction, pollution, and contamination — leaving little evidence for decision-makers to consider for its management.

“There’s a general lack of information and data … which makes it very hard to manage the resource sustainably,” said Vincent Casey, senior water, sanitation, and hygiene manager at WaterAid…(More)”.

The Practice and Potential of Blockchain Technologies for Extractive Sector Governance


Press Release: “Important questions are being raised about whether blockchain technologies can contribute to solving governance challenges in the mining, oil and gas sectors. This report seeks to begin addressing such questions, with particular reference to current blockchain applications and transparency efforts in the extractive sector.

It summarizes analysis by The Governance Lab (GovLab) at the New York University Tandon School of Engineering and the Natural Resource Governance Institute (NRGI). The study focused in particular on three activity areas: licensing and contracting, corporate registers and beneficial ownership, and commodity trading and supply chains.

Key messages:

  • Blockchain technology could potentially reduce transparency challenges and information asymmetries in certain parts of the extractives value chain. However, stakeholders considering blockchain technologies need a more nuanced understanding of problem definition, value proposition and blockchain attributes to ensure that such interventions could positively impact extractive sector governance.
  • The blockchain field currently lacks design principles, governance best practices, and open data standards that could ensure that the technology helps advance transparency and good governance in the extractive sector. Our analysis offers an initial set of design principles that could act as a starting point for a more targeted approach to the use of blockchain in improving extractives governance.
  • Most blockchain projects are preliminary concepts or pilots, with little demonstration of how to effectively scale up successful experiments, especially in countries with limited resources.
  • Meaningful impact evaluations or peer-reviewed publications that assess impact, including on the implications of blockchain’s emissions footprint, are still lacking. More broadly, a shared research agenda around blockchain could help address questions that are particularly ripe for future research.
  • Transition to a blockchain-enabled system is likely to be smoother and faster in cases when digital records are already available than when a government or company attempts to move from an analog system to one leveraging blockchain.
  • Companies or governments using blockchain are more likely to implement it successfully when they have a firm grasp of the technology, its strengths, its weaknesses, and how it fits into the broader governance landscape. But these actors are often overly reliant on, and empowering of, blockchain technology vendors and startups, which can lead to “lock-in”, whereby the market gets stuck with an approach even though market participants may be better off with an alternative.
  • The role played by intermediaries like financial institutions or registrars can determine the success or failure of blockchain applications….(More)”.
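
The report itself does not include code, but the core property these applications lean on (an append-only record where any retroactive edit is detectable) can be illustrated with a minimal hash chain in Python. The license records and field names below are hypothetical, and a real blockchain adds distributed consensus on top of this chaining.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with its predecessor's hash, chaining them."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class LicenseLedger:
    """Append-only chain of (hypothetical) mining-license records."""

    def __init__(self):
        self.blocks = []  # list of (record, prev_hash, this_hash) tuples

    def append(self, record: dict) -> None:
        prev = self.blocks[-1][2] if self.blocks else "genesis"
        self.blocks.append((record, prev, block_hash(record, prev)))

    def verify(self) -> bool:
        prev = "genesis"
        for record, stored_prev, stored_hash in self.blocks:
            if stored_prev != prev or block_hash(record, prev) != stored_hash:
                return False  # a retroactive edit breaks the chain here
            prev = stored_hash
        return True

ledger = LicenseLedger()
ledger.append({"license": "ML-001", "holder": "ExampleCo", "granted": "2020-01-15"})
ledger.append({"license": "ML-002", "holder": "OtherCo", "granted": "2020-03-02"})
print(ledger.verify())                        # True
ledger.blocks[0][0]["holder"] = "TamperedCo"  # simulate a retroactive edit
print(ledger.verify())                        # False: tampering is detected
```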

The open source movement takes on climate data


Article by Heather Clancy: “…many companies are moving to disclose “climate risk,” although far fewer are moving to actually minimize it. And as those tasked with preparing those reports can attest, the process of gathering the data for them is frustrating and complex, especially as the level of detail desired and required by investors becomes deeper.

That pain point was the inspiration for a new climate data project launched this week that will be spearheaded by the Linux Foundation, the nonprofit host organization for thousands of the most influential open source software and data initiatives in the world, such as Kubernetes. The foundation is central to the evolution of the Linux software that runs in the back offices of most major financial services firms.

There are four powerful founding members for the new group, the LF Climate Finance Foundation (LFCF): Insurance and asset management company Allianz, cloud software giants Amazon and Microsoft, and data intelligence powerhouse S&P Global. The foundation’s “planning team” includes the World Wide Fund for Nature (WWF), Ceres and the Sustainability Accounting Standards Board (SASB).

The group’s intention is to collaborate on an open source project called the OS-Climate platform, which will include economic and physical risk scenarios that investors, regulators, companies, financial analysts and others can use for their analysis. 

The idea is to create a “public service utility” where certain types of climate data can be accessed easily, then combined with other, more proprietary information that someone might be using for risk analysis, according to Truman Semans, CEO of OS-Climate, who was instrumental in getting the effort off the ground. “There are a whole lot of initiatives out there that address pieces of the puzzle, but no unified platform to allow those to interoperate,” he told me.
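
That "public service utility" pattern (open scenario data that anyone can join to their own proprietary holdings) is easy to picture as a data operation. The pandas sketch below is purely hypothetical: OS-Climate's actual schemas are not described in the article, so every table, column, and number here is invented to show the pattern.

```python
import pandas as pd

# Hypothetical open layer, as a platform like OS-Climate might expose it:
# physical-risk scores by region under different warming scenarios.
public_risk = pd.DataFrame({
    "region":           ["FL-coast", "FL-coast", "NL-delta", "NL-delta"],
    "scenario":         ["2C", "4C", "2C", "4C"],
    "flood_risk_score": [0.35, 0.60, 0.20, 0.45],
})

# Hypothetical proprietary layer a financial institution keeps in-house:
# its portfolio exposure (in millions) by region.
portfolio = pd.DataFrame({
    "region":      ["FL-coast", "NL-delta"],
    "exposure_mm": [120.0, 80.0],
})

# Join the open layer to the private one and compute a naive risk-weighted
# exposure per scenario, the kind of analysis the platform aims to enable.
merged = portfolio.merge(public_risk, on="region")
merged["risk_weighted_exposure"] = merged["exposure_mm"] * merged["flood_risk_score"]
print(merged.groupby("scenario")["risk_weighted_exposure"].sum())
```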

Why does this matter? It helps to understand the history of open source software, which was once a thing that many powerful software companies, notably Microsoft, abhorred because they were worried about the financial hit on their intellectual property. Flash forward to today and the open source software movement, “staffed” by literally millions of software developers, is credited with accelerating the creation of common system-level elements so that companies can focus their own resources on solving problems directly related to their business.

In short, this budding effort could make the right data available more quickly, so that businesses — particularly financial institutions — can make better informed decisions.

Or, as Microsoft’s chief intellectual property counsel, Jennifer Yokoyama, observed in the announcement press release: “Addressing climate issues in a meaningful way requires people and organizations to have access to data to better understand the impact of their actions. Opening up and sharing our contribution of significant and relevant sustainability data through the LF Climate Finance Foundation will help advance the financial modeling and understanding of climate change impact — an important step in affecting political change. We’re excited to collaborate with the other founding members and hope additional organizations will join.”…(More)”