France asks its citizens how to meet its climate-change targets


The Economist on “An experiment in consultative democracy”: “A nurse, a roofer, an electrician, a former fireman, a lycée pupil, a photographer, a teacher, a marketing manager, an entrepreneur and a civil servant. Sitting on red velvet benches in a domed art-deco amphitheatre in Paris, they and 140 colleagues are part of an unusual democratic experiment in a famously centralised country. Their mission: to draw up measures to reduce French greenhouse-gas emissions by at least 40% by 2030, in line with an EU target that is otherwise in danger of being missed (and which the European Commission now wants to tighten). Six months ago, none of them had met. Now, they have just one month left to show that they can reinvent the French democratic process—and help save the planet. “It’s our moment,” Sylvain, one of the delegates, tells his colleagues from the podium. “We have the chance to propose something historic.”

On March 6th the “citizens’ climate convention” was due to begin its penultimate three-day sitting, the sixth since it began work last October. The convention is made up of a representative sample of the French population, selected by randomly generated telephone numbers. President Emmanuel Macron devised it in an attempt to calm the country after the gilets jaunes (yellow jackets) crisis of 2018. In response to the demand for less top-down decision-making, he first launched what he grandly called a “great national debate”, which took place a year ago. He also pledged the creation of a citizens’ assembly. It is designed to focus on precisely the conundrum that provoked the original protests against a rise in the carbon tax on motor fuel: how to make green policy palatable, efficient and fair…(More)”.

Wisdom of stakeholder crowds in complex social–ecological systems


Paper by Payam Aminpour et al.: “Sustainable management of natural resources requires adequate scientific knowledge about complex relationships between human and natural systems. Such understanding is difficult to achieve in many contexts due to data scarcity and knowledge limitations.

We explore the potential of harnessing the collective intelligence of resource stakeholders to overcome this challenge. Using a fisheries example, we show that by aggregating the system knowledge held by stakeholders through graphical mental models, a crowd of diverse resource users produces a system model of social–ecological relationships that is comparable to the best scientific understanding.

We show that the averaged model from a crowd of diverse resource users outperforms those of more homogeneous groups. Importantly, however, we find that the averaged model from a larger sample of individuals can perform worse than one constructed from a smaller sample. Yet when mental models are first averaged within stakeholder-specific subgroups and the subgroup models are then aggregated, the effect is reversed. Our work identifies an inexpensive yet robust way to develop scientific understanding of complex social–ecological systems by leveraging the collective wisdom of non-scientist stakeholders…(More)”.
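
The subgroup effect is easy to picture if each mental model is encoded as a matrix of signed edge weights over a shared set of concepts. Below is a minimal sketch of the two aggregation strategies under that assumption; the group names, sizes and random weights are illustrative stand-ins, not the paper's data or method.

```python
import numpy as np

# Hypothetical encoding: each stakeholder's mental model is an n x n matrix of
# signed edge weights (influence of concept i on concept j); 0 = no link drawn.
rng = np.random.default_rng(0)
n = 6
groups = {  # deliberately unequal subgroup sizes
    "recreational": rng.uniform(-1, 1, size=(60, n, n)),
    "commercial":   rng.uniform(-1, 1, size=(15, n, n)),
    "managers":     rng.uniform(-1, 1, size=(10, n, n)),
}

# Naive crowd model: average every individual's matrix directly,
# so the largest subgroup dominates the result.
crowd_model = np.concatenate(list(groups.values())).mean(axis=0)

# Two-stage model: average within each stakeholder subgroup first, then
# average the subgroup models, giving each stakeholder type equal weight.
stakeholder_model = np.stack([m.mean(axis=0) for m in groups.values()]).mean(axis=0)

print(crowd_model.shape, stakeholder_model.shape)  # both (6, 6)
```

The two-stage average mirrors the within-subgroup aggregation the authors describe: it keeps one numerous stakeholder type from drowning out the signal held by smaller groups.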

Twitter might have a better read on floods than NOAA


Interview by Justine Calma: “Frustrated tweets led scientists to believe that tidal floods along the East Coast and Gulf Coast of the US are more annoying than official tide gauges suggest. Half a million geotagged tweets showed researchers that people were talking about disruptively high waters even when government gauges hadn’t recorded tide levels high enough to be considered a flood.
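
A toy sketch of the comparison at the heart of the study, with invented data: count flood-related geotagged tweets per day and flag days where complaints spike while the official gauge stays below its flood threshold. The thresholds, field names and numbers here are hypothetical, not the study's.

```python
import pandas as pd

# Hypothetical daily records for one coastal gauge location.
df = pd.DataFrame({
    "date":         pd.date_range("2020-03-01", periods=5),
    "flood_tweets": [2, 3, 41, 38, 4],          # geotagged tweets matching flood terms
    "gauge_m":      [0.6, 0.7, 1.1, 1.0, 0.6],  # official tide-gauge reading (metres)
})

FLOOD_THRESHOLD_M = 1.2                          # level officially logged as a flood
spike_level = 10 * df["flood_tweets"].median()   # crude tweet-spike detector

# Days when people complained about high water but no flood was recorded.
missed = df[(df["flood_tweets"] > spike_level) & (df["gauge_m"] < FLOOD_THRESHOLD_M)]
print(missed)
```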

Capturing these reactions on social media can help authorities better understand and address the more subtle, insidious ways that climate change is playing out in people’s daily lives. Coastal flooding is becoming a bigger problem as sea levels rise, but a study published recently in the journal Nature Communications suggests that officials aren’t doing a great job of recording that.

The Verge spoke with Frances Moore, lead author of the new study and a professor at the University of California, Davis. This isn’t the first time that she’s turned to Twitter for her climate research. Her previous work also found that people tend to stop reacting to unusual weather after dealing with it for a while — sometimes in as little as two years. Similar data from Twitter has been used to study how people coped with earthquakes and hurricanes…(More)”.

Are these the 20 top multi-stakeholder processes in 2020 to advance a digital ecosystem for the planet?


Paper by David Jensen, Karen Bakker and Christopher Reimer: “As outlined in our recent article, The promise and peril of a digital ecosystem for the planet, we propose that the ongoing digital revolution needs to be harnessed to drive a transformation towards global sustainability, environmental stewardship, and human well-being. Public, private and civil society actors must take deliberate action and collaborate to build a global digital ecosystem for the planet. What is needed is a digital ecosystem that mobilizes hardware, software and digital infrastructures, together with data analytics, to generate dynamic, real-time insights that can power the structural transformations required to achieve collective sustainability.

The digital revolution must also be used to abolish extreme poverty and reduce inequalities that jeopardize social cohesion and stability. Often, these social inequalities are tied to and overlap with ecological challenges. Ultimately, then, we must do nothing less than direct the digital revolution for planet, people, prosperity and peace.

To achieve this goal, we must embed the vision of a fair digital ecosystem for the planet into all of the key multi-stakeholder processes that are currently unfolding. We aim to do this through two new articles on Medium: a companion article on Building a digital ecosystem for the planet: 20 substantive priorities for 2020, and this one. In the companion article, we identify three primary engagement tracks: system architecture, applications, and governance. Within these three tracks, we outline 20 priorities for the new decade. Building from these priorities, our focus for this article is to identify a preliminary list of the top 20 most important multi-stakeholder processes that we must engage and influence in 2020…(More)”.

The wisdom of crowds: What smart cities can learn from a dead ox and live fish


Portland State University: “In 1906, Francis Galton was at a country fair where attendees had the opportunity to guess the weight of a dead ox. Galton took the guesses of 787 fair-goers and found that the average guess was only one pound off the correct weight — even when individual guesses were off base.

This concept, known as “the wisdom of crowds” or “collective intelligence,” has been applied to many situations over the past century, from people estimating the number of jellybeans in a jar to predicting the winners of major sporting events — often with high rates of success. Whatever the problem, the crowd’s average answer tends to be remarkably accurate.

But does this also apply to knowledge about systems, such as ecosystems, health care, or cities? Do we always need in-depth scientific inquiries to describe and manage them — or could we leverage crowds?

This question has fascinated Antonie J. Jetter, associate professor of Engineering and Technology Management, for many years. Now, there’s an answer. A recent study, co-authored by Jetter and published in Nature Sustainability, shows that diverse crowds of local natural resource stakeholders can collectively produce complex environmental models very similar to those of trained experts.

For this study, about 250 anglers, water guards and board members of German fishing clubs were asked to draw connections showing how, from their perspective, ecological relationships influence the pike stock and how factors like nutrients and fishing pressure help determine the number of pike in a freshwater lake ecosystem. The individuals’ drawings — their so-called mental models — were then mathematically combined into a collective model representing their averaged understanding of the ecosystem and compared with the best scientific knowledge on the same subject.
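
The combining step can be sketched as matrix averaging: treat each drawing as a matrix of signed edge weights, average across individuals, and correlate the result with the expert model. The toy version below uses invented data, not the study's, but shows the mechanism at work: individual noise largely cancels in the mean.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6  # shared concepts: nutrients, prey fish, fishing pressure, pike, ...
expert = rng.uniform(-1, 1, size=(n, n))  # stand-in for the scientific model

# 250 individual mental models: the expert structure plus personal noise.
individuals = expert + rng.normal(scale=0.5, size=(250, n, n))
collective = individuals.mean(axis=0)     # the averaged mental model

# Correlate edge weights of one individual, and of the collective, with the experts'.
r_single = np.corrcoef(individuals[0].ravel(), expert.ravel())[0, 1]
r_crowd = np.corrcoef(collective.ravel(), expert.ravel())[0, 1]
print(f"one angler vs experts:   r = {r_single:.2f}")
print(f"collective vs experts:   r = {r_crowd:.2f}")
```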

The result is astonishing. If you combine the ideas from many individual anglers by averaging their mental models, the final outcomes correspond more or less exactly to the scientific knowledge of pike ecology — local knowledge of stakeholders produces results that are in no way inferior to lengthy and expensive scientific studies….(More)”.

Global Fishing Watch: Pooling Data and Expertise to Combat Illegal Fishing


Data Collaborative Case Study by Michelle Winowatan, Andrew Young, and Stefaan Verhulst: “Global Fishing Watch, originally set up through a collaboration between Oceana, SkyTruth and Google, is an independent nonprofit organization dedicated to advancing responsible stewardship of our oceans through increased transparency in fishing activity and scientific research. Using big data processing and machine learning, Global Fishing Watch visualizes, tracks, and shares data about global fishing activity in near-real time and for free via its public map. To date, the platform tracks approximately 65,000 commercial fishing vessels globally. These insights have been used in a number of academic publications, ocean advocacy efforts, and law enforcement activities.

Data Collaborative Model: Based on the typology of data collaborative practice areas, Global Fishing Watch is an example of the data pooling model of data collaboration, specifically a public data pool. Public data pools co-mingle data assets from multiple data holders — including governments and companies — and make those shared assets available on the web. This approach enabled the data stewards and stakeholders involved in Global Fishing Watch to bring together multiple data streams from both public- and private-sector entities in a single location. This single point of access provides the public and relevant authorities with user-friendly access to actionable, previously fragmented data that can drive efforts to address compliance in fisheries and illegal fishing around the world.
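
As a rough sketch of what a public data pool involves mechanically (feeds, fields and sources here are hypothetical, not Global Fishing Watch's actual schema): records from different holders are tagged with their provenance, co-mingled, and de-duplicated into a single queryable table.

```python
import pandas as pd

# Hypothetical vessel-position feeds from two different data holders.
gov_feed = pd.DataFrame({
    "mmsi": [111, 222], "ts": ["2020-01-01T00:00Z"] * 2,
    "lat": [48.1, 47.9], "lon": [-125.2, -126.0],
})
company_feed = pd.DataFrame({
    "mmsi": [222, 333], "ts": ["2020-01-01T00:00Z"] * 2,
    "lat": [47.9, 46.5], "lon": [-126.0, -128.3],
})

# Pool the assets, keep provenance, and drop duplicate sightings.
pool = pd.concat([
    gov_feed.assign(source="gov"),
    company_feed.assign(source="company"),
], ignore_index=True).drop_duplicates(subset=["mmsi", "ts", "lat", "lon"])
print(pool)
```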

Data Stewardship Approach: Global Fishing Watch also provides a clear illustration of the importance of data stewards. For instance, representatives from Google Earth Outreach, one of the data holders, played an important stewardship role in seeking to connect and coordinate with SkyTruth and Oceana, two important nonprofit environmental actors who were working separately prior to this initiative. The brokering of this partnership helped to bring relevant data assets from the public and private sectors to bear in support of institutional efforts to address the stubborn challenge of illegal fishing.

Read the full case study here.”

Manual of Digital Earth


Book by Huadong Guo, Michael F. Goodchild and Alessandro Annoni: “This open access book offers a summary of the development of Digital Earth over the past twenty years. By reviewing the initial vision of Digital Earth, the evolution of that vision, the relevant key technologies, and the role of Digital Earth in helping people respond to global challenges, this publication reveals how and why Digital Earth is becoming vital for acquiring, processing, analysing and mining the rapidly growing volume of global data sets about the Earth.

The main aspects of Digital Earth covered here include: Digital Earth platforms, remote sensing and navigation satellites, processing and visualizing geospatial information, geospatial information infrastructures, big data and cloud computing, transformation and zooming, artificial intelligence, Internet of Things, and social media. Moreover, the book covers in detail the multi-layered/multi-faceted roles of Digital Earth in response to sustainable development goals, climate change, and disaster mitigation, the applications of Digital Earth (such as digital city and digital heritage), citizen science in support of Digital Earth, the economic value of Digital Earth, and so on. This book also reviews the regional and national development of Digital Earth around the world, and discusses the role and effect of education and ethics. Lastly, it concludes with a summary of the challenges and forecasts the future trends of Digital Earth. By sharing case studies and a broad range of general and scientific insights into the science and technology of Digital Earth, this book offers an essential introduction for an ever-growing international audience….(More)”.

AI For Good Is Often Bad


Mark Latonero at Wired: “….Within the last few years, a number of tech companies, from Google to Huawei, have launched their own programs under the AI for Good banner. They deploy technologies like machine-learning algorithms to address critical issues like crime, poverty, hunger, and disease. In May, French president Emmanuel Macron invited about 60 leaders of AI-driven companies, like Facebook’s Mark Zuckerberg, to a Tech for Good Summit in Paris. The same month, the United Nations in Geneva hosted its third annual AI for Good Global Summit, sponsored by XPrize. (Disclosure: I have spoken at it twice.) A recent McKinsey report on AI for Social Good provides an analysis of 160 current cases claiming to use AI to address the world’s most pressing and intractable problems.

While AI for good programs often warrant genuine excitement, they should also invite increased scrutiny. Good intentions are not enough when it comes to deploying AI for those in greatest need. In fact, the fanfare around these projects smacks of tech solutionism, which can mask root causes and the risks of experimenting with AI on vulnerable people without appropriate safeguards.

Tech companies that set out to develop a tool for the common good, not only their self-interest, soon face a dilemma: They lack the expertise in the intractable social and humanitarian issues facing much of the world. That’s why companies like Intel have partnered with National Geographic and the Leonardo DiCaprio Foundation on wildlife trafficking. And why Facebook partnered with the Red Cross to find missing people after disasters. IBM’s social-good program alone boasts 19 partnerships with NGOs and government agencies. Partnerships are smart. The last thing society needs is for engineers in enclaves like Silicon Valley to deploy AI tools for global problems they know little about….(More)”.

Thinking About the Commons


Carol M. Rose at the International Journal of the Commons: “This article, originally a speech at the conference Leçons de Droit Comparé sur les Communs (Sciences Po, Paris), explores current developments in theoretical thinking about the commons. It keys off contemporary reconsiderations of Garrett Hardin’s “Tragedy of the Commons” and Elinor Ostrom’s response to Hardin in Governing the Commons and later work.

Ostrom was among the best-known critics of Hardin’s idea of a “tragedy,” but Ostrom’s own work has also raised some questions in more recent commons literature. One key question is the very uncertain relationship between community-based resource control and democratic rights. A second key question revolves around whether commons are understood as limited common regimes, central to Ostrom’s work, or as open access, as espoused by more recent advocates of widespread access to information and communications networks….(More)”.

Study says ‘specific’ weather forecasts can’t be made more than 10 days in advance


Matthew Cappucci at the Washington Post: “Imagine someone telling you the weather forecast for New Year’s Day today, two months in advance, with exact temperature bounds and rainfall to a hundredth of an inch. Sounds too good to be true, yes?

A new study in Science says it’s simply not possible. But just how far can we take a day-by-day forecast?

The practical limit to daily forecasting

“A skillful forecast lead time of midlatitude instantaneous weather is around 10 days, which serves as the practical predictability limit,” according to a study published in April in the Journal of the Atmospheric Sciences.

Those limits aren’t likely to change much anytime soon. Even if scientists had the data they needed and a more perfect understanding of all forecasting’s complexities, skillful forecasts could extend out to about 14 or 15 days only, the 2019 study found, because of the chaotic nature of the atmosphere.
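
The chaos argument can be made concrete with a toy system. The sketch below (an illustration, not part of the study) integrates the classic Lorenz-63 equations from two initial states that differ by one part in a million; the tiny error grows until the two "forecasts" bear no relation to each other.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (a toy 'atmosphere')."""
    x, y, z = state
    return state + dt * np.array([
        sigma * (y - x),        # dx/dt
        x * (rho - z) - y,      # dy/dt
        x * y - beta * z,       # dz/dt
    ])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])  # same state, up to a tiny measurement error

for step in range(1, 2501):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 500 == 0:
        print(f"t = {step * 0.01:4.0f}  forecast separation = {np.linalg.norm(a - b):.1e}")
```

Because the error grows exponentially, halving the initial uncertainty only postpones the blow-up by a fixed interval, which is why better data and bigger computers extend useful lead time by days, not weeks.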

“Two weeks is about right. It’s as close to the ultimate limit as we can demonstrate,” the study’s lead author told Science Magazine.

The American Meteorological Society agrees. Its statement on the limits of prediction, in place since 2015, says that “presently, forecasts of daily or specific weather conditions do not exhibit useful skill beyond eight days, meaning that their accuracy is low.”

Although the American Meteorological Society strongly advises against issuing specific forecasts beyond eight days, popular weather vendor AccuWeather has, for years, churned out detailed predictions many days further into the future. It initiated 45-day forecasts in 2013, which it extended to 90 days in 2016 — and has been heavily criticized for it….(More)”.