Microsoft’s Open Notre Dame initiative calls for sharing of open data in restoration effort


Hamza Jawad at Neowin: “On April 15, a disastrous fire ravaged the famous Notre-Dame cathedral in France. In the wake of the disaster, tech companies such as Apple announced that they would be donating to help in rebuilding efforts. Other companies, like Ubisoft, took a different approach to supporting the restoration work that followed.

A few days ago, Microsoft and Iconem announced the “Open Notre Dame” initiative to contribute towards the restoration of the ‘Lady of Paris’. The open data project is intended to help gather and analyze existing documents on the monument, while simultaneously producing and sharing 3D models of it. Today, the company has once again detailed the workings of this initiative, along with a call for the sharing of open data to help accelerate the restoration efforts….

GitHub will host temporal models of the building, which can then be easily shared with and accessed by various other initiatives in a concerted effort to maintain as much accuracy as possible. Many companies, including Ubisoft, have already provided data that will help form the foundation for these open-source models. More details regarding the project can be found in the original blog post….(More)”.

Open data could have helped us learn from another mining dam disaster


Paulo A. de Souza Jr. at Nature: “The recent Brumadinho dam disaster in Brazil is an example of infrastructure failure with catastrophic consequences. Over 300 people were reported dead or missing, and nearly 400 more were rescued alive. The environmental impact is massive and difficult to quantify. The frequency of these disasters demonstrates that the current systems for monitoring dam integrity and alerting managers, authorities and the public to ongoing changes in tailings facilities are, in many cases, not working as they should. Adequate prevention procedures are also needed: monitoring can be perfect, but without timely and appropriate action it is useless. Good management therefore requires quality data. Undisputedly, management practices of industrial sites, including audit procedures, must improve, and data and metadata from preceding accidents should be better used. There is a rich literature available about the design, construction, operation, maintenance and decommissioning of tailings facilities, including guidelines, standards, case studies, technical reports, consultancy and audit practices, and scientific papers. Regulation varies from country to country and, in some cases, as in Australia and Canada, is controlled by individual state agencies. There are, however, few datasets shared with the broader technical and scientific community, particularly for prior incidents. Conspicuously lacking are comprehensive data related to the monitoring of large infrastructures such as mining dams.

Today, Scientific Data published a Data Descriptor presenting a dataset obtained from 54 laboratory experiments on the breaching of fluvial dikes due to flow overtopping. (Re)use of such data can help improve our understanding of the fundamental processes underpinning industrial infrastructure collapse (e.g., fluvial dike breaching, mining dam failure), and help assess the accuracy of numerical models for the prediction of such incidents. This is essential for better management of floods, mitigation of dam collapses, and similar accidents. The authors propose a framework that could exemplify how data involving similar infrastructure can be stored, shared, published, and reused…(More)”.
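The kind of (re)use the authors describe, scoring numerical models against shared experimental data, becomes straightforward once the data are open. A minimal sketch (the file name and column names are hypothetical, not the Data Descriptor's actual schema):

```python
import numpy as np
import pandas as pd

# Hypothetical layout: one row per laboratory dike-breach experiment, with
# an observed final breach width and a numerical model's prediction of it.
df = pd.read_csv("dike_breach_experiments.csv")

observed = df["observed_breach_width_m"]
predicted = df["model_breach_width_m"]

# Simple skill metrics for the numerical model against the experiments.
rmse = np.sqrt(np.mean((predicted - observed) ** 2))
bias = np.mean(predicted - observed)
print(f"RMSE: {rmse:.3f} m, mean bias: {bias:+.3f} m")
```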

We’ll soon know the exact air pollution from every power plant in the world. That’s huge.


David Roberts at Vox: “A nonprofit artificial intelligence firm called WattTime is going to use satellite imagery to precisely track the air pollution (including carbon emissions) coming out of every single power plant in the world, in real time. And it’s going to make the data public.

This is a very big deal. Poor monitoring and gaming of emissions data have made it difficult to enforce pollution restrictions on power plants. This system promises to effectively eliminate both problems….

The plan is to use data from satellites whose imagery is publicly available (like the European Union’s Copernicus network and the US Landsat network), as well as from a few private companies that charge for their data (like Digital Globe). The data will come from a variety of sensors operating at different wavelengths, including thermal infrared that can detect heat.

The images will be processed by various algorithms to detect signs of emissions. It has already been demonstrated that a great deal of pollution can be tracked simply through identifying visible smoke. WattTime says it can also use infrared imaging to identify heat from smokestack plumes or cooling-water discharge. Sensors that can directly track NO2 emissions are in development, according to WattTime executive director Gavin McCormick.

Between visible smoke, heat, and NO2, WattTime will be able to derive exact, real-time emissions information, including information on carbon emissions, for every power plant in the world. (McCormick says the data may also be used to derive information about water pollutants like nitrates or mercury.)
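WattTime has not published its algorithms, but the fusion step it describes can be sketched in toy form. In the sketch below, the indicator scales, the weights, and the plant-level emissions factor are all invented for illustration:

```python
import numpy as np

def plant_activity(visible_smoke, thermal_anomaly_k, no2_column):
    """Toy fusion of three remote-sensing indicators into an activity score
    in [0, 1]. A real system would train per-plant, per-sensor models."""
    smoke = np.clip(visible_smoke, 0.0, 1.0)            # smoke-classifier output
    heat = np.clip(thermal_anomaly_k / 30.0, 0.0, 1.0)  # kelvin above background
    no2 = np.clip(no2_column / 5e15, 0.0, 1.0)          # molecules per cm^2
    return 0.4 * smoke + 0.4 * heat + 0.2 * no2

# Scale a hypothetical full-load emissions rate (tCO2/h) by estimated activity.
FULL_LOAD_TCO2_PER_H = 950.0
activity = plant_activity(visible_smoke=0.8, thermal_anomaly_k=22.0,
                          no2_column=3.2e15)
print(f"Estimated emissions: {activity * FULL_LOAD_TCO2_PER_H:.0f} tCO2/h")
```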

Google.org, Google’s philanthropic wing, is getting the project off the ground (pardon the pun) with a $1.7 million grant; it was selected through the Google AI Impact Challenge….(More)”.

A weather tech startup wants to do forecasts based on cell phone signals


Douglas Heaven at MIT Technology Review: “On 14 April more snow fell on Chicago than it had in nearly 40 years. Weather services didn’t see it coming: they forecast one or two inches at worst. But when the late winter snowstorm came it caused widespread disruption, dumping enough snow that airlines had to cancel more than 700 flights across all of the city’s airports.

One airline did better than most, however. Instead of relying on the usual weather forecasts, it listened to ClimaCell – a Boston-based “weather tech” start-up that claims it can predict the weather more accurately than anyone else. According to the company, its correct forecast of the severity of the coming snowstorm allowed the airline to better manage its schedules and minimize losses due to delays and diversions. 

Founded in 2015, ClimaCell has spent the last few years developing the technology and business relationships that allow it to tap into millions of signals from cell phones and other wireless devices around the world. It uses the quality of these signals as a proxy for local weather conditions, such as precipitation and air quality. It also analyzes images from street cameras. It is offering a weather forecasting service to subscribers that it claims is 60 percent more accurate than that of existing providers, such as NOAA.

The internet of weather

The approach makes sense, in principle. Other forecasters use proxies, such as radar signals. But by using information from millions of everyday wireless devices, ClimaCell claims it has a far more fine-grained view of most of the globe than other forecasters get from the existing network of weather sensors, which range from ground-based devices to satellites. (ClimaCell taps into these, too.)…(More)”.
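ClimaCell has not disclosed its methods, but the physics behind one such proxy is well documented: rain attenuates microwave signals, and the attenuation follows an approximate power law in the rain rate (cf. ITU-R P.838). A toy inversion, using illustrative coefficients rather than the tabulated ones:

```python
def rain_rate_from_attenuation(excess_loss_db, link_km, k=0.12, alpha=1.1):
    """Invert the power-law relation A = k * R**alpha * L between path
    attenuation A (dB), rain rate R (mm/h), and link length L (km).
    k and alpha depend on frequency and polarization; the defaults here
    are illustrative values, roughly plausible near 20 GHz."""
    specific_attenuation = excess_loss_db / link_km     # dB per km
    return (specific_attenuation / k) ** (1.0 / alpha)  # mm/h

# Example: 6 dB of excess loss observed on a 3 km backhaul link.
print(f"{rain_rate_from_attenuation(6.0, 3.0):.1f} mm/h")  # ~13 mm/h
```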

Policies as information carriers: How environmental policies may change beliefs and consequent behavior


Paper by Ann-Kathrin Koessler and Stefanie Engel: “This paper discusses how policy interventions not only alter the legal and financial framework in which an individual operates, but can also change relevant beliefs. We argue that such changes in how an individual perceives herself, relevant others, the regulator and/or the activity in question can lead to behavioral changes that were neither intended nor expected when the policy was designed.

In the environmental economics literature, these secondary impacts of conventional policy interventions have not been systematically reviewed. Hence, we intend to raise awareness of these effects. In this paper, we review relevant research from behavioral economics and psychology, and identify and discuss the domains for which beliefs can change. Lastly, we discuss design options with which an undesired change in beliefs can be avoided when a new policy is put into practice….(More)”

How Ireland’s Citizens’ Assembly helped climate action


Blog post by Frances Foley: “…In July 2016, the new government – led by Fine Gael, backed by independents – put forward a bill to establish a national-level Citizens’ Assembly to look at the biggest issues of the day. These included the challenges of an ageing population; the role of fixed-term parliaments; referendums; the 8th Amendment on abortion; and climate change.

Citizens from every region, every socio-economic background, each ethnicity and age group and from right across the spectrum of political opinion convened over the course of two weekends between September and November 2017. The issue seemed daunting in scale and complexity, but the participants had been well briefed and had at their disposal a line-up of experts, scientists, advocates and other witnesses who would help them make sense of the material. By the end, the citizens had produced a radical series of recommendations which went far beyond what any major Irish party was promising, surprising even the initiators of the process….

As expected, the passage of some of the proposals through the Irish party gauntlet has not been smooth. The eight-hour debate on increasing the carbon tax, for example, suggests that mixing deliberative and representative democracy still produces conflict and confusion. It is certainly clear that parliaments have to adapt and develop if citizens’ assemblies are ever to find their place in our modern democracies.

But the most encouraging move has been the simple acknowledgement that many of the barriers to implementation lie at the level of governance. The new Climate Action Commission, with a mandate to monitor climate action across government, should act as the governmental guarantor of the vision from the Citizens’ Assembly. Citizens’ proposals have themselves stimulated a review of internal government processes to stop their demands getting mired in party wrangling and government bureaucracy. By their very nature, the success of citizens’ assemblies can also provide an alternative vision of how decisions can be made – and in so doing shame political parties and parliaments into improving their decision-making practices.

Does the Irish Citizens’ Assembly constitute a case of rapid transition? In terms of its breadth, scale and vision, the experiment is impressive. But in terms of speed, deliberative processes are often criticised for being slow, unwieldy and costly. The response to this should be to ask what we’re getting: whilst an Assembly is not the most rapid vehicle for change – most serious processes take several months, if not a couple of years – the results, both in specific outcomes and in cultural or political shifts, can be astounding….

With respect to climate change, this harmony between ends and means is particularly significant. The climate crisis is the most severe collective decision-making challenge of our times, one that demands courage, but also careful thought….(More)”.

Leveraging Big Data for Social Responsibility


Paper by Cynthia Ann Peterson: “Big data has the potential to revolutionize the way social risks are managed by providing enhanced insight to enable more informed actions. The objective of this paper is to share the approach taken by PETRONAS to leverage big data to enhance its social performance practice, specifically in social risk assessments and its grievance mechanism.

The paper will discuss the benefits, challenges and opportunities of managing social risk through analytics, and how PETRONAS has taken those factors into consideration in enhancing its social risk assessment and grievance mechanism tools. Key considerations such as the disaggregation of data, appropriate leading and lagging indicators, and applying a human rights lens to data will also be discussed.

Leveraging big data is still in its early stages in the social risk space, as in other areas of the oil and gas industry, according to research by Wood Mackenzie. Even so, there are several concerns: the aggregation of data may prevent risks to minority or vulnerable groups from surfacing; privacy breaches may violate human rights; and prescriptive analysis may produce discrimination, for example regarding a community’s supposed propensity to pose certain social risks to projects or operations. Certainly, there are many challenges ahead which need to be considered, including how best to take a human rights approach to using big data. The aggregation concern, in particular, is illustrated in the sketch below.
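A minimal, invented example of why disaggregation matters: an aggregate grievance metric can look acceptable while one community group is served far worse. The column names and figures below are hypothetical.

```python
import pandas as pd

# Hypothetical grievance-mechanism records.
grievances = pd.DataFrame({
    "community_group": ["majority"] * 8 + ["minority"] * 2,
    "days_to_resolve": [5, 7, 6, 4, 8, 5, 6, 7, 60, 75],
})

# Aggregate view: looks tolerable on average (~18 days).
print("Overall mean:", grievances["days_to_resolve"].mean())

# Disaggregated view: grievances from the minority group take ~10x longer.
print(grievances.groupby("community_group")["days_to_resolve"].mean())
```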

Nevertheless, harnessing the power of big data will help social risk practitioners turn a high volume of disparate raw data from grievance mechanisms and social risk assessments into information that can be used to avoid or mitigate risks now and in the future through predictive technology. Consumer and other industries are already benefiting from this approach, and social performance practitioners in the oil and gas industry can emulate these proven models….(More)”.

Characterizing the cultural niches of North American birds


Justin G. Schuetz and Alison Johnston at PNAS: “Efforts to mitigate the current biodiversity crisis require a better understanding of how and why humans value other species. We use Internet query data and citizen science data to characterize public interest in 621 bird species across the United States. We estimate the relative popularity of different birds by quantifying how frequently people use Google to search for species, relative to the rates at which they are encountered in the environment.

In intraspecific analyses, we also quantify the degree to which Google searches are limited to, or extend beyond, the places in which people encounter each species. The resulting metrics of popularity and geographic specificity of interest allow us to define aspects of relationships between people and birds within a cultural niche space. We then estimate the influence of species traits and socially constructed labels on niche positions to assess the importance of observations and ideas in shaping public interest in birds.
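In spirit, the paper's popularity measure is a ratio of attention to availability. A toy version of that calculation (the species figures are invented, and the published analysis is considerably more sophisticated):

```python
import numpy as np

# Hypothetical per-species data: Google search volume and the rate at which
# observers actually encounter each species in the field.
species = ["Bald Eagle", "House Sparrow", "Cerulean Warbler"]
search_rate = np.array([120_000.0, 8_000.0, 900.0])  # searches per month
encounter_rate = np.array([0.02, 0.35, 0.001])       # encounters per survey

# Popularity: searches relative to encounters, on a log scale so that
# over- and under-searched species are symmetric around a common baseline.
popularity = np.log(search_rate / encounter_rate)
for name, score in zip(species, popularity):
    print(f"{name}: {score:.1f}")
```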

Our analyses show clear effects of migratory strategy, color, degree of association with bird feeders, and, especially, body size on niche position. They also indicate that cultural labels, including “endangered,” “introduced,” and, especially, “team mascot,” are strongly associated with the magnitude and geographic specificity of public interest in birds. Our results provide a framework for exploring complex relationships between humans and other species and enable more informed decision-making across diverse bird conservation strategies and goals….(More)”.

Black Wave: How Networks and Governance Shaped Japan’s 3/11 Disasters


Book by Daniel Aldrich: “Despite the devastation caused by the magnitude 9.0 earthquake and 60-foot tsunami that struck Japan in 2011, some 96% of those living and working in the most disaster-stricken region of Tōhoku made it through. Smaller earthquakes and tsunamis have killed far more people in nearby China and India. What accounts for the exceptionally high survival rate? And why is it that some towns and cities in the Tōhoku region have built back more quickly than others?

Black Wave illuminates two critical factors that had a direct influence on why survival rates varied so much across the Tōhoku region following the 3/11 disasters and why the rebuilding process has not moved in lockstep across the region. Individuals and communities with stronger networks and better governance, Daniel P. Aldrich shows, had higher survival rates and accelerated recoveries. Communities with fewer such ties faced lower survival rates and harder recovery processes. Beyond the individual and neighborhood levels, the rebuilding process has varied greatly: some towns and cities have worked independently on rebuilding plans, ignoring recommendations from the national government and moving quickly to institute their own visions, while others have followed the guidelines offered by Tokyo-based bureaucrats for economic development and rebuilding….(More)”.

This tech tells cities when floods are coming–and what they will destroy


Ben Paynter at FastCompany: “Several years ago, one of the eventual founders of One Concern nearly died in a tragic flood. Today, the company specializes in using artificial intelligence to predict how natural disasters are unfolding in real time on a city-block-level basis, in order to help disaster responders save as many lives as possible….

To fix that, One Concern debuted Flood Concern in late 2018. It creates map-based visualizations of where water surges may hit hardest, up to five days ahead of an impending storm. For cities, that includes not just time-lapse breakdowns of how the water will rise, how fast it could move, and what direction it will be flowing, but also which structures will get swamped or washed away, and how different mitigation efforts – from levee building to dam releases – will impact each scenario. It’s the winner of Fast Company’s 2019 World Changing Ideas Awards in the AI and Data category.

So far, Flood Concern has been retroactively tested against events like Hurricane Harvey to show that it could have predicted what areas would be most impacted well ahead of the storm. The company, which was founded in Silicon Valley in 2015, started with one of that region’s pressing threats: earthquakes. It’s since earned contracts with cities like San Francisco, Los Angeles, and Cupertino, as well as private insurance companies….

One Concern’s first offering, dubbed Seismic Concern, takes existing information from satellite images and building permits to figure out what kind of ground structures are built on, and how they might respond to shaking. If a big one hits, the program can extrapolate from the epicenter to suggest the likeliest places for destruction, and then adjust as more data from things like 911 calls and social media gets factored in….(More)”.
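One Concern's models are proprietary, but the two-stage pipeline described here, a physical prior refined by incoming ground reports, can be sketched in toy form. Every constant below (the attenuation formula, the soil factor, the report weighting) is invented for illustration and is not a real ground-motion model:

```python
import math

def predicted_intensity(magnitude, distance_km, soil_factor=1.0):
    """Toy shaking estimate: grows with magnitude, decays with log-distance,
    and is amplified on softer soils. Not a real ground-motion equation."""
    base = 1.5 * magnitude - 3.0 * math.log10(distance_km + 1.0)
    return max(0.0, base) * soil_factor

def update_with_reports(prior, reported_intensity, n_reports, w_per_report=0.1):
    """Blend the model prior with crowd reports (911 calls, social media),
    trusting the reports more as their count grows."""
    w = min(1.0, n_reports * w_per_report)
    return (1.0 - w) * prior + w * reported_intensity

prior = predicted_intensity(magnitude=6.8, distance_km=12.0, soil_factor=1.3)
posterior = update_with_reports(prior, reported_intensity=7.5, n_reports=4)
print(f"block estimate: prior {prior:.1f} -> updated {posterior:.1f}")
```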