Aliens in Europe. An open approach to involve more people in invasive species detection


Paper by Sven Schade et al: “Amplified by phenomena of globalisation such as increased human mobility and the worldwide shipping of goods, animals and plants are increasingly spreading outside their native habitats. A few of these ‘aliens’ have negative impacts on their environment, including threats to local biodiversity, agricultural productivity, and human health. Our work addresses these threats, particularly within the European Union (EU), where a related legal framework has been established. We follow an open and participatory approach that allows more people to share their experiences of invasive alien species (IAS) in their surroundings. Over the past three years, we developed a mobile phone application, together with the underlying data management and validation infrastructure, which allows smartphone users to report a selected list of IAS. We put quality assurance and data integration mechanisms into place that allow the uptake of information into existing official systems, in order to make it accessible to relevant policy-making at the EU level.

This article summarises our scientific methodology and technical approach, explains our decisions, and provides an outlook on the future of IAS monitoring involving citizens and utilising the latest technological advancements. Last but not least, we emphasise software design for reuse, within the domain of IAS monitoring but also for supporting citizen science apps more generally. While much has already been achieved, many scientific, technical and organisational challenges remain to be addressed before data can be seamlessly shared and integrated. Here, we particularly highlight issues that emerge in an international setting, which involves many different stakeholders….(More)”.
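
The paper does not publish its validation code, but as a rough illustration of what the automatic part of such a quality-assurance step might look like, here is a minimal, purely hypothetical sketch in Python (the species list, fields, and checks are assumptions for illustration, not the project's actual rules):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical subset of reportable species; the real app works against the
# EU list of invasive alien species of Union concern.
REPORTABLE_SPECIES = {"Vespa velutina", "Procambarus clarkii", "Heracleum mantegazzianum"}

@dataclass
class SightingReport:
    species: str            # scientific name selected in the app
    latitude: float
    longitude: float
    observed_at: datetime
    photo_attached: bool    # a photo typically supports later expert validation

def passes_basic_checks(report: SightingReport) -> bool:
    """Automatic checks a record might pass before being queued for expert review."""
    in_list = report.species in REPORTABLE_SPECIES
    valid_coords = -90 <= report.latitude <= 90 and -180 <= report.longitude <= 180
    not_in_future = report.observed_at <= datetime.now(timezone.utc)
    return in_list and valid_coords and not_in_future and report.photo_attached

report = SightingReport("Vespa velutina", 48.85, 2.35,
                        datetime(2019, 6, 1, tzinfo=timezone.utc), photo_attached=True)
print(passes_basic_checks(report))  # True -> candidate for expert validation and official uptake
```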

Tackling Climate Change with Machine Learning


Paper by David Rolnick et al: “Climate change is one of the greatest challenges facing humanity, and we, as machine learning experts, may wonder how we can help. Here we describe how machine learning can be a powerful tool in reducing greenhouse gas emissions and helping society adapt to a changing climate. From smart grids to disaster management, we identify high impact problems where existing gaps can be filled by machine learning, in collaboration with other fields. Our recommendations encompass exciting research questions as well as promising business opportunities. We call on the machine learning community to join the global effort against climate change….(More)”.

Behavioral Science and Climate Policy


Chapter by Michael Howlett and Stuti Rawat: “Behavioral science consists of the systematic analysis of processes underlying human behavior through experimentation and observation, drawing on knowledge, research, and methods from a variety of fields such as economics, psychology, and sociology. Because policymaking involves efforts to modify or alter the behavior of policy-takers and centers on the processes of decision-making in government, it has always been concerned with behavioral psychology. Classic studies of decision-making in the field derived their frameworks and concepts from psychology, and the founder of policy sciences, Harold Lasswell, was himself trained as a behavioral political scientist. Hence, it should not be surprising that the use of behavioral science is a feature of many policy areas, including climate change policy.

This is given extra emphasis, however, because climate change policymaking and the rise of climate change as a policy issue coincide with a resurgence in behaviorally inspired policy analysis and design brought about by the development of behavioral economics. Thus, efforts to deal with climate change have come into being at a time when behavioral governance has been gaining traction worldwide under the influence of works by, among others, Kahneman and Tversky, Thaler, and Sunstein. Such behavioral governance studies have focused on the psychological and cognitive behavioral processes in individuals and collectives, in order to inform, design, and implement different modes of governing. They have been promoted by policy scholars, including many economists working in the area who prefer its insights to those put forward by classical or neoclassical economics.

In the context of climate change policy, behavioral science plays two key roles—through its use of behaviorally premised policy instruments as new modes of public policy being used or proposed to be used, in conjunction with traditional climate change policy tools; and as a way of understanding some of the barriers to compliance and policy design encountered by governments in combating the “super wicked problem” of climate change. Five kinds of behavioral tools have been found to be most commonly used in relation to climate change policy: provision of information, use of social norms, goal setting, default rules, and framing. A large proportion of behavioral tools has been used in the energy sector, because of its importance in the context of climate change action and the fact that energy consumption is easy to monitor, thereby facilitating impact assessment….(More)”.

The Impact of Citizen Environmental Science in the United States


Paper by George Wyeth, Lee C. Paddock, Alison Parker, Robert L. Glicksman and Jecoliah Williams: “An increasingly sophisticated public, rapid changes in monitoring technology, the ability to process large volumes of data, and social media are increasing the capacity for members of the public and advocacy groups to gather, interpret, and exchange environmental data. This development has the potential to alter the government-centric approach to environmental governance; however, citizen science has had a mixed record in influencing government decisions and actions. This Article reviews the rapid changes under way in the field of citizen science and examines what makes citizen science initiatives impactful, as well as the barriers to greater impact. It reports on 10 case studies and evaluates them to provide findings about the state of citizen science and recommendations on what might be done to increase its influence on environmental decisionmaking….(More)”.

We Need a Data-Rich Picture of What’s Killing the Planet


Clive Thompson at Wired: “…Marine litter isn’t the only hazard whose contours we can’t fully see. The United Nations has 93 indicators to measure the environmental dimensions of “sustainable development,” and amazingly, the UN found that we have little to no data on 68 percent of them—like how rapidly land is being degraded, the rate of ocean acidification, or the trade in poached wildlife. Sometimes this is because we haven’t collected it; in other cases some data exists but hasn’t been shared globally, or it’s in a myriad of incompatible formats. No matter what, we’re flying blind. “And you can’t manage something if you can’t measure it,” says David Jensen, the UN’s head of environmental peacebuilding.

In other words, if we’re going to help the planet heal and adapt, we need a data revolution. We need to build a “digital ecosystem for the environment,” as Jensen puts it.

The good news is that we’ve got the tools. If there’s one thing tech excels at (for good and ill), it’s surveillance, right? We live in a world filled with cameras and pocket computers, titanic cloud computing, and the eerily sharp insights of machine learning. And this stuff can be used for something truly worthwhile: studying the planet.

There are already some remarkable cases of tech helping to break through the fog. Consider Global Fishing Watch, a nonprofit that tracks the world’s fishing vessels, looking for overfishing. They use everything from GPS-like signals emitted by ships to satellite infrared imaging of ship lighting, plugged into neural networks. (It’s massive, cloud-scale data: over 60 million data points per day, making the AI more than 90 percent accurate at classifying what type of fishing activity a boat is engaged in.)

“If a vessel is spending its time in an area that has little tuna and a lot of sharks, that’s questionable,” says Brian Sullivan, cofounder of the project and a senior program manager at Google Earth Outreach. Crucially, Global Fishing Watch makes its data open to anyone—so now the National Geographic Society is using it to lobby for new marine preserves, and governments and nonprofits use it to target illicit fishing.

If we want better environmental data, we’ll need for-profit companies with the expertise and high-end sensors to pitch in too. Planet, a firm with an array of 140 satellites, takes daily snapshots of the entire Earth. Customers like insurance and financial firms love that sort of data. (It helps them understand weather and climate risk.) But Planet also offers it to services like Global Forest Watch, which maps deforestation and makes the information available to anyone (like activists who help bust illegal loggers). Meanwhile, Google’s skill in cloud-based data crunching helps illuminate the state of surface water: Google digitized 30 years of measurements from around the globe—extracting some from ancient magnetic tapes—then created an easy-to-use online tool that lets resource-poor countries figure out where their water needs protecting….(More)”.
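
The Global Fishing Watch example above rests on a fairly standard supervised-learning step: turning each vessel track into a handful of features and classifying the activity. The sketch below is a toy stand-in for that idea, not the project's pipeline; the features, classes, and model size are assumptions, and synthetic data replaces the real AIS feeds.

```python
# Toy sketch: classify vessel activity from a few track-level features with a
# small neural network. Synthetic data stands in for AIS-derived features such
# as mean speed, speed variance, course change, and distance from shore.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Three classes as a stand-in for activity types (e.g. transiting, longlining, trawling).
X, y = make_classification(n_samples=2000, n_features=4, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```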

Can we nudge farmers into saving water? Evidence from a randomised experiment


Paper by Sylvain Chabé-Ferret, Philippe Le Coent, Arnaud Reynaud, Julie Subervie and Daniel Lepercq: “We test whether social comparison nudges can promote water-saving behaviour among farmers as a complement to traditional CAP measures. We conducted a randomised controlled trial among 200 farmers equipped with irrigation smart meters in South-West France. Treated farmers received weekly information on individual and group water consumption over four months. Our results rule out medium to large effect sizes of the nudge. Moreover, they suggest that the nudge was effective at reducing the consumption of those who irrigate the most, although it appears to have reduced the proportion of those who do not consume water at all….(More)”.
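
For readers unfamiliar with how such a trial is read, the core comparison is simply treated versus control consumption, with a confidence interval that tells you which effect sizes the data can rule out. The sketch below illustrates that logic with made-up numbers only; it is not the study's data or code.

```python
# Illustrative only: average treatment effect of a two-arm nudge trial,
# with simulated consumption figures (m3 per farm) in place of real data.
import numpy as np

rng = np.random.default_rng(42)
control = rng.gamma(shape=2.0, scale=500.0, size=100)
treated = rng.gamma(shape=2.0, scale=480.0, size=100)

diff = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / treated.size + control.var(ddof=1) / control.size)
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

print(f"estimated effect: {diff:.0f} m3 (95% CI {ci_low:.0f} to {ci_high:.0f})")
# If the interval excludes large negative values, the trial "rules out"
# medium-to-large reductions, which is the kind of statement the authors make.
```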

The 100 Questions Initiative: Sourcing 100 questions on key societal challenges that can be answered by data insights


Press Release: “The Governance Lab at the NYU Tandon School of Engineering announced the launch of the 100 Questions Initiative — an effort to identify the most important societal questions whose answers can be found in data and data science if the power of data collaboratives is harnessed.

The initiative, launched with initial support from Schmidt Futures, seeks to address challenges on numerous topics, including migration, climate change, poverty, and the future of work.

For each of these areas and more, the initiative will seek to identify questions that could help unlock the potential of data and data science with the broader goal of fostering positive social, environmental, and economic transformation. These questions will be sourced by leveraging “bilinguals” — practitioners across disciplines from all over the world who possess both domain knowledge and data science expertise.

The 100 Questions Initiative starts by identifying 10 key questions related to migration. These include questions related to the geographies of migration, migrant well-being, enforcement and security, and the vulnerabilities of displaced people. This inaugural effort involves partnerships with the International Organization for Migration (IOM) and the European Commission, both of which will provide subject-matter expertise and facilitation support within the framework of the Big Data for Migration Alliance (BD4M).

“While there have been tremendous efforts to gather and analyze data relevant to many of the world’s most pressing challenges, as a society, we have not taken the time to ensure we’re asking the right questions to unlock the true potential of data to help address these challenges,” said Stefaan Verhulst, co-founder and chief research and development officer of The GovLab. “Unlike other efforts focused on data supply or data science expertise, this project seeks to radically improve the set of questions that, if answered, could transform the way we solve 21st century problems.”

In addition to identifying key questions, the 100 Questions Initiative will also focus on creating new data collaboratives. Data collaboratives are an emerging form of public-private partnership that help unlock the public interest value of previously siloed data. The GovLab has conducted significant research on the value of data collaboration, finding that inter-sectoral collaboration can both increase access to information (e.g., the vast stores of data held by private companies) and unleash the potential of that information to serve the public good….(More)”.

Social media data reveal where visitors to nature locations provide potential benefits or threats to biodiversity


University of Helsinki: “In a new article published in the journal Science of the Total Environment, a team of researchers assessed global patterns of visitation rates, attractiveness and pressure to more than 12,000 Important Bird and Biodiversity Areas (IBAs), which are sites of international significance for nature conservation, by using geolocated data mined from social media (Twitter and Flickr).

The study found that Important Bird and Biodiversity Areas located in Europe and Asia, and in temperate biomes, had the highest density of social media users. Results also showed that sites of importance for congregatory species, which were also more accessible, more densely populated and provided more tourism facilities, received higher visitation than did sites richer in bird species.

“Resources in biodiversity conservation are woefully inadequate and novel data sources from social media provide openly available user-generated information about human-nature interactions, at an unprecedented spatio-temporal scale”, says Dr Anna Hausmann from the University of Helsinki, a conservation scientist leading the study. “Our group has been exploring and validating data retrieved from social media to understand people’s preferences for experiencing nature in national parks at a local, national and continental scale”, she continues, “in this study, we expand our analyses at a global level”. …

“Social media content and metadata contain useful information for understanding human-nature interactions in space and time”, says Prof. Tuuli Toivonen, another co-author of the paper and the leader of the Digital Geography Lab at the University of Helsinki. “Social media data can also be used to cross-validate and enrich data collected by conservation organizations”, she continues. The study found that the 17 percent of all Important Bird and Biodiversity Areas (IBAs) that were assessed by experts to be under greater human disturbance also had a higher density of social media users….(More)”.
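
The underlying operation — counting geotagged posts, or distinct users, that fall inside each site's boundary — is a plain spatial join. The snippet below is a minimal sketch of that idea, assuming the shapely library and invented coordinates; the study itself works with BirdLife's IBA boundaries and large volumes of Twitter and Flickr records.

```python
# Minimal sketch: count distinct users posting inside each protected-area
# polygon as a simple proxy for visitation. Boundaries and posts are invented.
from shapely.geometry import Point, Polygon

ibas = {
    "IBA_A": Polygon([(24.90, 60.15), (25.05, 60.15), (25.05, 60.25), (24.90, 60.25)]),
    "IBA_B": Polygon([(22.20, 60.40), (22.35, 60.40), (22.35, 60.50), (22.20, 60.50)]),
}
posts = [("user1", Point(24.95, 60.20)), ("user2", Point(25.00, 60.18)),
         ("user1", Point(22.30, 60.45)), ("user3", Point(26.10, 61.00))]  # (user, lon/lat point)

for name, boundary in ibas.items():
    users = {user for user, point in posts if boundary.contains(point)}
    print(name, "distinct users:", len(users))
```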

Microsoft’s Open Notre Dame initiative calls for sharing of open data in restoration effort


Hamza Jawad at Neowin: “On April 15, a disastrous fire ravaged the famous Notre-Dame cathedral in France. In the wake of the incident, tech companies, such as Apple, announced that they would be donating to help in rebuilding efforts. On the other hand, some companies, like Ubisoft, took a different approach to supporting the restoration that followed.

A few days ago, Microsoft and Iconem announced the “Open Notre Dame” initiative to contribute towards the restoration of the ‘Lady of Paris’. The open data project is said to help gather and analyze existing documents on the monument, while simultaneously producing and sharing its 3D models. Today, the company has once again detailed the workings of this initiative, along with a call for the sharing of open data to help quicken the restoration efforts….

GitHub will host temporal models of the building, which can then be easily shared with and accessed by various other initiatives in a concerted effort to maintain accuracy as much as possible. Many companies, including Ubisoft, have already provided data that will help form the foundation for these open source models. More details regarding the project can be obtained on the original blog post….(More)”.

Open data could have helped us learn from another mining dam disaster


Paulo A. de Souza Jr. at Nature: “The recent Brumadinho dam disaster in Brazil is an example of infrastructure failure with catastrophic consequences. Over 300 people were reported dead or missing, and nearly 400 more were rescued alive. The environmental impact is massive and difficult to quantify. The frequency of these disasters demonstrates that the current assets for monitoring integrity and alerting managers, authorities and the public to ongoing changes in tailings facilities are, in many cases, not working as they should. There is also the need for adequate prevention procedures. Monitoring can be perfect, but without timely and appropriate action, it will be useless. Good management therefore requires quality data.

Undisputedly, management practices of industrial sites, including audit procedures, must improve, and data and metadata available from preceding accidents should be better used. There is a rich literature available about the design, construction, operation, maintenance and decommissioning of tailings facilities, including guidelines, standards, case studies, technical reports, consultancy and audit practices, and scientific papers. Regulation varies from country to country and in some cases, like Australia and Canada, it is controlled by individual state agencies. There are, however, few datasets that are shared with the technical and scientific community more globally, particularly for prior incidents. Conspicuously lacking are comprehensive data related to the monitoring of large infrastructures such as mining dams.

Today, Scientific Data published a Data Descriptor presenting a dataset obtained from 54 laboratory experiments on the breaching of fluvial dikes because of flow overtopping. (Re)use of such data can help improve our understanding of fundamental processes underpinning industrial infrastructure collapse (e.g., fluvial dike breaching, mining dam failure), and assess the accuracy of numerical models for the prediction of such incidents. This is absolutely essential for better management of floods, mitigation of dam collapses, and similar accidents. The authors propose a framework that could exemplify how data involving similar infrastructure can be stored, shared, published, and reused…(More)”.
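
As a rough illustration of the last point: once such experimental data are openly shared, checking a numerical model against them can be as simple as comparing predicted and measured breach outcomes. The figures below are invented for the sketch and are not taken from the published dataset.

```python
# Illustrative only: score a (hypothetical) breach model against (hypothetical)
# laboratory measurements using RMSE and mean bias.
import numpy as np

measured_peak_discharge = np.array([0.41, 0.55, 0.38, 0.62, 0.47])   # m3/s, made-up lab runs
predicted_peak_discharge = np.array([0.45, 0.50, 0.40, 0.70, 0.44])  # m3/s, made-up model output

error = predicted_peak_discharge - measured_peak_discharge
rmse = np.sqrt(np.mean(error ** 2))
bias = error.mean()
print(f"RMSE: {rmse:.3f} m3/s, mean bias: {bias:+.3f} m3/s")
```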