Are these the 20 top multi-stakeholder processes in 2020 to advance a digital ecosystem for the planet?


Paper by David Jensen, Karen Bakker and Christopher Reimer: “As outlined in our recent article, The promise and peril of a digital ecosystem for the planet, we propose that the ongoing digital revolution needs to be harnessed to drive a transformation towards global sustainability, environmental stewardship, and human well-being. Public, private and civil society actors must take deliberate action and collaborate to build a global digital ecosystem for the planet. A digital ecosystem that mobilizes hardware, software and digital infrastructures together with data analytics to generate dynamic, real-time insights that can power various structural transformations is needed to achieve collective sustainability.

The digital revolution must also be used to abolish extreme poverty and reduce inequalities that jeopardize social cohesion and stability. Often, these social inequalities are tied to and overlap with ecological challenges. Ultimately, then, we must do nothing less than direct the digital revolution for planet, people, prosperity and peace.

To achieve this goal, we must embed the vision of a fair digital ecosystem for the planet into all of the key multi-stakeholder processes that are currently unfolding. We aim to do this through two new articles on Medium: a companion article on Building a digital ecosystem for the planet: 20 substantive priorities for 2020, and this one. In the companion article, we identify three primary engagement tracks: system architecture, applications, and governance. Within these three tracks, we outline 20 priorities for the new decade. Building from these priorities, our focus for this article is to identify a preliminary list of the top 20 most important multi-stakeholder processes that we must engage and influence in 2020….(More).

The wisdom of crowds: What smart cities can learn from a dead ox and live fish


Portland State University: “In 1906, Francis Galton was at a country fair where attendees had the opportunity to guess the weight of a dead ox. Galton took the guesses of 787 fair-goers and found that the average guess was only one pound off the correct weight — even when individual guesses were off base.

This concept, known as “the wisdom of crowds” or “collective intelligence,” has been applied to many situations over the past century, from people estimating the number of jellybeans in a jar to predicting the winners of major sporting events — often with high rates of success. Whatever the problem, the average answer of the crowd seems to be an accurate solution.
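The averaging at the heart of Galton's anecdote is easy to simulate. The sketch below uses synthetic data: the 1,198-pound weight comes from Galton's published account, but the 787 simulated guesses and their 75-pound noise level are illustrative assumptions, not his actual dataset.

```python
import random
import statistics

random.seed(2020)

TRUE_WEIGHT = 1198  # pounds, per Galton's 1907 account of the contest

# Simulate 787 independent guesses: unbiased on average, but individually
# noisy (the 75 lb standard deviation is purely illustrative).
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(787)]

crowd_error = abs(statistics.mean(guesses) - TRUE_WEIGHT)
typical_error = statistics.mean(abs(g - TRUE_WEIGHT) for g in guesses)

# The crowd's error shrinks roughly with 1/sqrt(n), so with 787 guesses
# it is far smaller than the typical individual's error.
print(f"crowd off by {crowd_error:.1f} lb; typical guesser off by {typical_error:.1f} lb")
```

The mechanism is simply that independent errors cancel in the mean, which is why the result generalizes from jellybean jars to the ecosystem models discussed below.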

But does this also apply to knowledge about systems, such as ecosystems, health care, or cities? Do we always need in-depth scientific inquiries to describe and manage them — or could we leverage crowds?

This question has fascinated Antonie J. Jetter, associate professor of Engineering and Technology Management, for many years. Now, there’s an answer. A recent study, which was co-authored by Jetter and published in Nature Sustainability, shows that diverse crowds of local natural resource stakeholders can collectively produce complex environmental models very similar to those of trained experts.

For this study, about 250 anglers, water guards and board members of German fishing clubs were asked to draw connections showing, from their perspective, how ecological relationships influence the pike stock and how factors like nutrients and fishing pressure help determine the number of pike in a freshwater lake ecosystem. The individuals’ drawings — their so-called mental models — were then mathematically combined into a collective model representing their averaged understanding of the ecosystem and compared with the best scientific knowledge on the same subject.

The result is astonishing. If you combine the ideas from many individual anglers by averaging their mental models, the final outcome corresponds closely to the scientific knowledge of pike ecology — the local knowledge of stakeholders produces results that are in no way inferior to lengthy and expensive scientific studies….(More)”.
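A minimal sketch of the kind of aggregation the study describes: each stakeholder's mental model is a weighted graph of cause-effect links, and the collective model averages the edge weights. The concept names, weights, and the simple arithmetic mean below are illustrative assumptions, not the study's actual data or method.

```python
# Each mental model maps (cause, effect) pairs to an influence weight
# in [-1, 1]; the concepts and weights are invented for illustration.
angler_a = {
    ("nutrients", "prey fish"): 0.8,
    ("prey fish", "pike stock"): 0.6,
    ("fishing pressure", "pike stock"): -0.7,
}
angler_b = {
    ("nutrients", "prey fish"): 0.5,
    ("fishing pressure", "pike stock"): -0.9,
    ("nutrients", "pike stock"): 0.2,
}

def aggregate(models):
    """Average each edge's weight across all models (a missing edge counts as 0)."""
    edges = {edge for model in models for edge in model}
    n = len(models)
    return {edge: sum(m.get(edge, 0.0) for m in models) / n for edge in edges}

collective = aggregate([angler_a, angler_b])
# The shared "nutrients -> prey fish" link averages to about 0.65, while
# links only one angler drew are diluted toward zero.
```

With hundreds of contributors instead of two, idiosyncratic links fade and widely shared causal structure dominates — the graph analogue of the cancelling errors in Galton's ox-weighing crowd.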

Global Fishing Watch: Pooling Data and Expertise to Combat Illegal Fishing


Data Collaborative Case Study by Michelle Winowatan, Andrew Young, and Stefaan Verhulst: “Global Fishing Watch, originally set up through a collaboration between Oceana, SkyTruth and Google, is an independent nonprofit organization dedicated to advancing responsible stewardship of our oceans through increased transparency in fishing activity and scientific research. Using big data processing and machine learning, Global Fishing Watch visualizes, tracks, and shares data about global fishing activity in near-real time and for free via their public map. To date, the platform tracks approximately 65,000 commercial fishing vessels globally. These insights have been used in a number of academic publications, ocean advocacy efforts, and law enforcement activities.

Data Collaborative Model: Based on the typology of data collaborative practice areas, Global Fishing Watch is an example of the data pooling model of data collaboration, specifically a public data pool. Public data pools co-mingle data assets from multiple data holders — including governments and companies — and make those shared assets available on the web. This approach enabled the data stewards and stakeholders involved in Global Fishing Watch to bring together multiple data streams from both public- and private-sector entities in a single location. This single point of access provides the public and relevant authorities with user-friendly access to actionable, previously fragmented data that can drive efforts to address compliance in fisheries and illegal fishing around the world.

Data Stewardship Approach: Global Fishing Watch also provides a clear illustration of the importance of data stewards. For instance, representatives from Google Earth Outreach, one of the data holders, played an important stewardship role in seeking to connect and coordinate with SkyTruth and Oceana, two important nonprofit environmental actors who were working separately prior to this initiative. The brokering of this partnership helped to bring relevant data assets from the public and private sectors to bear in support of institutional efforts to address the stubborn challenge of illegal fishing.

Read the full case study here.”

Manual of Digital Earth


Book by Huadong Guo, Michael F. Goodchild and Alessandro Annoni: “This open access book offers a summary of the development of Digital Earth over the past twenty years. By reviewing the initial vision of Digital Earth, the evolution of that vision, the relevant key technologies, and the role of Digital Earth in helping people respond to global challenges, this publication reveals how and why Digital Earth is becoming vital for acquiring, processing, analysing and mining the rapidly growing volume of global data sets about the Earth.

The main aspects of Digital Earth covered here include: Digital Earth platforms, remote sensing and navigation satellites, processing and visualizing geospatial information, geospatial information infrastructures, big data and cloud computing, transformation and zooming, artificial intelligence, Internet of Things, and social media. Moreover, the book covers in detail the multi-layered/multi-faceted roles of Digital Earth in response to sustainable development goals, climate change, and mitigating disasters; the applications of Digital Earth (such as digital city and digital heritage); citizen science in support of Digital Earth; the economic value of Digital Earth; and so on. This book also reviews the regional and national development of Digital Earth around the world, and discusses the role and effect of education and ethics. Lastly, it concludes with a summary of the challenges and forecasts the future trends of Digital Earth. By sharing case studies and a broad range of general and scientific insights into the science and technology of Digital Earth, this book offers an essential introduction for an ever-growing international audience….(More)”.

AI For Good Is Often Bad


Mark Latonero at Wired: “….Within the last few years, a number of tech companies, from Google to Huawei, have launched their own programs under the AI for Good banner. They deploy technologies like machine-learning algorithms to address critical issues like crime, poverty, hunger, and disease. In May, French president Emmanuel Macron invited about 60 leaders of AI-driven companies, like Facebook’s Mark Zuckerberg, to a Tech for Good Summit in Paris. The same month, the United Nations in Geneva hosted its third annual AI for Good Global Summit, sponsored by XPrize. (Disclosure: I have spoken at it twice.) A recent McKinsey report on AI for Social Good provides an analysis of 160 current cases claiming to use AI to address the world’s most pressing and intractable problems.

While AI for good programs often warrant genuine excitement, they should also invite increased scrutiny. Good intentions are not enough when it comes to deploying AI for those in greatest need. In fact, the fanfare around these projects smacks of tech solutionism, which can mask root causes and the risks of experimenting with AI on vulnerable people without appropriate safeguards.

Tech companies that set out to develop a tool for the common good, not only their self-interest, soon face a dilemma: They lack the expertise in the intractable social and humanitarian issues facing much of the world. That’s why companies like Intel have partnered with National Geographic and the Leonardo DiCaprio Foundation on wildlife trafficking. And why Facebook partnered with the Red Cross to find missing people after disasters. IBM’s social-good program alone boasts 19 partnerships with NGOs and government agencies. Partnerships are smart. The last thing society needs is for engineers in enclaves like Silicon Valley to deploy AI tools for global problems they know little about….(More)”.

Thinking About the Commons


Carol M. Rose at the International Journal of the Commons: “This article, originally a speech at the conference Leçons de Droit Comparé sur les Communs, Sciences-Po, Paris, explores current developments in theoretical thinking about the commons. It keys off contemporary reconsiderations of Garrett Hardin’s “Tragedy of the Commons” and Elinor Ostrom’s response to Hardin in Governing the Commons and later work.

Ostrom was among the best-known critics of Hardin’s idea of a “tragedy,” but Ostrom’s own work has also raised some questions in more recent commons literature. One key question is the very uncertain relationship between community-based resource control and democratic rights. A second key question revolves around the understanding of commons on the one hand as limited common regimes, central to Ostrom’s work, or as open access, as espoused by more recent advocates of widespread access to information and communications networks….(More)”.

Study says ‘specific’ weather forecasts can’t be made more than 10 days in advance


Matthew Cappucci at the Washington Post: “Imagine someone telling you the weather forecast for New Year’s Day today, two months in advance, with exact temperature bounds and rainfall to a hundredth of an inch. Sounds too good to be true, yes?

A new study in Science says it’s simply not possible. But just how far can we take a day-by-day forecast?

The practical limit to daily forecasting

“A skillful forecast lead time of midlatitude instantaneous weather is around 10 days, which serves as the practical predictability limit,” according to a study published in April in the Journal of the Atmospheric Sciences.

Those limits aren’t likely to change much anytime soon. Even if scientists had the data they needed and a more perfect understanding of all forecasting’s complexities, skillful forecasts could extend out to about 14 or 15 days only, the 2019 study found, because of the chaotic nature of the atmosphere.

“Two weeks is about right. It’s as close to the ultimate limit as we can demonstrate,” the study’s lead author told Science Magazine.
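One way to see where a roughly two-week ceiling comes from is a back-of-envelope property of chaotic systems: small initial errors double on a fixed timescale until they saturate at the point where the forecast is no better than climatology. The numbers below are illustrative assumptions, not figures from the study.

```python
import math

initial_error = 0.1      # analysis uncertainty in some field, e.g. kelvin (assumed)
saturation_error = 10.0  # error level at which the forecast has no skill (assumed)
doubling_time = 2.0      # assumed error-doubling time, in days

# The number of doublings needed for the error to grow from initial to
# saturation, times the doubling time, gives the forecast horizon.
horizon_days = doubling_time * math.log2(saturation_error / initial_error)
# about 13 days under these assumptions, consistent with the two-week limit
```

The exponential growth means better observations buy surprisingly little: halving the initial error adds only one doubling time, about two days here, to the horizon.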

The American Meteorological Society agrees. Its statement on the limits of prediction, in place since 2015, notes that “presently, forecasts of daily or specific weather conditions do not exhibit useful skill beyond eight days, meaning that their accuracy is low.”

Although the American Meteorological Society strongly advises against issuing specific forecasts beyond eight days, popular weather vendor AccuWeather has, for years, churned out detailed predictions many days further into the future. It initiated 45-day forecasts in 2013, which it extended to 90 days in 2016 — and has been heavily criticized for it….(More)”.

Finland’s model in utilising forest data


Report by Matti Valonen et al: “The aim of this study is to depict the Finnish Forest Centre’s Metsään.fi-website’s background, objectives and implementation and to assess its development needs and future prospects. The Metsään.fi-service included in the Metsään.fi-website is a free e-service for forest owners and corporate actors (companies, associations and service providers) in the forest sector, whose aim is to support active decision-making among forest owners by offering forest resource data and maps on forest properties, by making contacts with the authorities easier through online services, and by acting as a platform for offering forest services, among other things.

In addition to the Metsään.fi-service, the website includes open forest data services that offer the users national forest resource data that is not linked with personal information.

Private forests are in a key position as raw material sources for traditional and new forest-based bioeconomy. In addition to wood material, the forests produce non-timber forest products (for example berries and mushrooms), opportunities for recreation and other ecosystem services.

Private forests cover roughly 60 percent of forest land but supply about 80 percent of the domestic wood used by the forest industry. In 2017 the value of forest industry production was 21 billion euros, a fifth of the entire industrial production value in Finland. Forest industry exports in 2017 were worth about 12 billion euros, covering a fifth of the entire export of goods. The forest sector is therefore important for Finland’s national economy…(More)”.

Internet of Water


About: “Water is the essence of life and vital to the well-being of every person, economy, and ecosystem on the planet. But around the globe and here in the United States, water challenges are mounting as climate change, population growth, and other drivers of water stress increase. Many of these challenges are regional in scope and larger than any one organization (or even state), such as the depletion of multi-state aquifers, basin-scale flooding, or the widespread accumulation of nutrients leading to dead zones. Much of the infrastructure built to address these problems decades ago, including our data infrastructure, is struggling to meet these challenges. Much of our water data exists in paper formats unique to the organization collecting the data. Often, these organizations existed long before the personal computer was created (1975) or the internet became mainstream (mid-1990s). As organizations adopted data infrastructure in the late 1990s, it was with the mindset of “normal infrastructure” at the time. It was built to last for decades, rather than to adapt to rapid technological change.

New water data infrastructure with new technologies that enable data to flow seamlessly between users and generate information for real-time management are needed to meet our growing water challenges. Decision-makers need accurate, timely data to understand current conditions, identify sustainability problems, illuminate possible solutions, track progress, and adapt along the way. Stakeholders need easy-to-understand metrics of water conditions so they can make sure managers and policymakers protect the environment and the public’s water supplies. The water community needs to continually improve how they manage this complex resource by using data and communicating information to support decision-making. In short, a sustained effort is required to accelerate the development of open data and information systems to support sustainable water resources management. The Internet of Water (IoW) is designed to be just such an effort….(More)”.

Massive Citizen Science Effort Seeks to Survey the Entire Great Barrier Reef


Jessica Wynne Lockhart at Smithsonian: “In August, marine biologists Johnny Gaskell and Peter Mumby and a team of researchers boarded a boat headed into unknown waters off the coast of Australia. For 14 long hours, they ploughed over 200 nautical miles, a Google Maps cache as their only guide. Just before dawn, they arrived at their destination: a previously uncharted blue hole—a cavernous opening descending through the seafloor.

After the rough night, Mumby was rewarded with something he hadn’t seen in his 30-year career. The reef surrounding the blue hole had nearly 100 percent healthy coral cover. Such a find is rare in the Great Barrier Reef, where coral bleaching events in 2016 and 2017 led to headlines proclaiming the reef “dead.”

“It made me think, ‘this is the story that people need to hear,’” Mumby says.

The expedition from Daydream Island off the coast of Queensland was a pilot program to test the methodology for the Great Reef Census, a citizen science project headed by Andy Ridley, founder of the annual conservation event Earth Hour. His latest organization, Citizens of the Great Barrier Reef, has set the ambitious goal of surveying the entire 1,400-mile-long reef system in 2020…(More)”.