AI For Good Is Often Bad


Mark Latonero at Wired: “….Within the last few years, a number of tech companies, from Google to Huawei, have launched their own programs under the AI for Good banner. They deploy technologies like machine-learning algorithms to address critical issues like crime, poverty, hunger, and disease. In May, French president Emmanuel Macron invited about 60 leaders of AI-driven companies, like Facebook’s Mark Zuckerberg, to a Tech for Good Summit in Paris. The same month, the United Nations in Geneva hosted its third annual AI for Good Global Summit, sponsored by XPrize. (Disclosure: I have spoken at it twice.) A recent McKinsey report on AI for Social Good provides an analysis of 160 current cases claiming to use AI to address the world’s most pressing and intractable problems.

While AI for good programs often warrant genuine excitement, they should also invite increased scrutiny. Good intentions are not enough when it comes to deploying AI for those in greatest need. In fact, the fanfare around these projects smacks of tech solutionism, which can mask root causes and the risks of experimenting with AI on vulnerable people without appropriate safeguards.

Tech companies that set out to develop a tool for the common good, not only their self-interest, soon face a dilemma: They lack the expertise in the intractable social and humanitarian issues facing much of the world. That’s why companies like Intel have partnered with National Geographic and the Leonardo DiCaprio Foundation on wildlife trafficking. And why Facebook partnered with the Red Cross to find missing people after disasters. IBM’s social-good program alone boasts 19 partnerships with NGOs and government agencies. Partnerships are smart. The last thing society needs is for engineers in enclaves like Silicon Valley to deploy AI tools for global problems they know little about….(More)”.

Thinking About the Commons


Carol M. Rose at the International Journal of the Commons: “This article, originally a speech at the conference Leçons de Droit Comparé sur les Communs (Sciences-Po, Paris), explores current developments in theoretical thinking about the commons. It keys off contemporary reconsiderations of Garrett Hardin’s “Tragedy of the Commons” and Elinor Ostrom’s response to Hardin in Governing the Commons and later work.

Ostrom was among the best-known critics of Hardin’s idea of a “tragedy,” but Ostrom’s own work has also raised some questions in more recent commons literature. One key question is the very uncertain relationship between community-based resource control and democratic rights. A second revolves around whether commons are best understood as limited common regimes, central to Ostrom’s work, or as open access, as espoused by more recent advocates of widespread access to information and communications networks….(More)”.

Study says ‘specific’ weather forecasts can’t be made more than 10 days in advance


Matthew Cappucci at the Washington Post: “Imagine someone telling you today, two months in advance, the weather forecast for New Year’s Day, with exact temperature bounds and rainfall to a hundredth of an inch. Sounds too good to be true, yes?

A new study says it’s simply not possible. But just how far can we take a day-by-day forecast?

The practical limit to daily forecasting

“A skillful forecast lead time of midlatitude instantaneous weather is around 10 days, which serves as the practical predictability limit,” according to a study published in April in the Journal of the Atmospheric Sciences.

Those limits aren’t likely to change much anytime soon. Even if scientists had the data they needed and a more perfect understanding of all forecasting’s complexities, skillful forecasts could extend out to about 14 or 15 days only, the 2019 study found, because of the chaotic nature of the atmosphere.

“Two weeks is about right. It’s as close to the ultimate limit as we can demonstrate,” the study’s lead author told Science Magazine.
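
The chaotic error growth behind that two-week ceiling is easy to illustrate. The sketch below is a minimal illustration, not taken from the study: it integrates the classic Lorenz-63 system from two starting points that differ by one part in a million and tracks how quickly the trajectories separate, the same sensitivity to initial conditions that limits real forecasts.

```python
# Minimal illustration (not from the study): sensitivity to initial conditions
# in the Lorenz-63 system, the toy model behind the "two-week limit" argument.
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 equations."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=0.01, steps=3000):
    """Integrate with a fourth-order Runge-Kutta scheme and return the path."""
    path = [state]
    for _ in range(steps):
        k1 = lorenz(state)
        k2 = lorenz(state + 0.5 * dt * k1)
        k3 = lorenz(state + 0.5 * dt * k2)
        k4 = lorenz(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        path.append(state)
    return np.array(path)

# Two "forecasts" whose starting points differ by one part in a million.
a = integrate(np.array([1.0, 1.0, 1.0]))
b = integrate(np.array([1.0 + 1e-6, 1.0, 1.0]))

error = np.linalg.norm(a - b, axis=1)
for step in (0, 500, 1000, 1500, 2000, 2500):
    print(f"t = {step * 0.01:5.1f}  separation = {error[step]:.2e}")
```

The separation grows roughly exponentially until it is as large as the attractor itself, at which point the "forecast" says nothing about the true state; the study quantifies where that saturation happens for the real atmosphere.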

The American Meteorological Society agrees. Their statement on the limits of prediction, in place since 2015, states that “presently, forecasts of daily or specific weather conditions do not exhibit useful skill beyond eight days, meaning that their accuracy is low.”


Although the American Meteorological Society strongly advises against issuing specific forecasts beyond eight days, popular weather vendor AccuWeather has, for years, churned out detailed predictions many days further into the future. It initiated 45-day forecasts in 2013, which it extended to 90 days in 2016 — and has been heavily criticized for it….(More)”.

Finland’s model in utilising forest data


Report by Matti Valonen et al: “The aim of this study is to describe the background, objectives and implementation of the Finnish Forest Centre’s Metsään.fi website and to assess its development needs and future prospects. The Metsään.fi service, included in the website, is a free e-service for forest owners and corporate actors (companies, associations and service providers) in the forest sector. Its aim is to support active decision-making among forest owners by offering forest resource data and maps on forest properties, by making contact with the authorities easier through online services, and by acting as a platform for offering forest services, among other things.

In addition to the Metsään.fi service, the website includes open forest data services that offer users national forest resource data that is not linked to personal information.
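
As a rough sketch of how such open, non-personal forest resource data can be used downstream (the file name and column names below are hypothetical, not the actual Metsään.fi data schema), a few lines of Python can aggregate stand-level records into municipality-level figures:

```python
# Hypothetical example: summarise open forest stand data by municipality.
# The file name and column names are illustrative, not the Metsään.fi schema.
import csv
from collections import defaultdict

volume_by_municipality = defaultdict(float)  # m3 of growing stock
area_by_municipality = defaultdict(float)    # hectares

with open("forest_stands.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        municipality = row["municipality"]
        area_ha = float(row["area_ha"])
        volume_m3_per_ha = float(row["volume_m3_per_ha"])
        area_by_municipality[municipality] += area_ha
        volume_by_municipality[municipality] += area_ha * volume_m3_per_ha

for municipality in sorted(volume_by_municipality):
    total_m3 = volume_by_municipality[municipality]
    mean_m3_per_ha = total_m3 / area_by_municipality[municipality]
    print(f"{municipality}: {total_m3:,.0f} m3 total, {mean_m3_per_ha:.0f} m3/ha")
```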

Private forests are in a key position as a source of raw material for the traditional and emerging forest-based bioeconomy. In addition to wood, forests provide non-timber forest products (for example, berries and mushrooms), opportunities for recreation and other ecosystem services.

Private forests cover roughly 60 percent of Finland’s forest land but supply about 80 percent of the domestic wood used by the forest industry. In 2017 the value of forest industry production was 21 billion euros, about a fifth of total industrial production in Finland, and forest industry exports were worth about 12 billion euros, a fifth of all goods exports. The forest sector is therefore important for Finland’s national economy…(More)”.

Internet of Water


About: “Water is the essence of life and vital to the well-being of every person, economy, and ecosystem on the planet. But around the globe and here in the United States, water challenges are mounting as climate change, population growth, and other drivers of water stress increase. Many of these challenges are regional in scope and larger than any one organization or even state, such as the depletion of multi-state aquifers, basin-scale flooding, or the widespread accumulation of nutrients leading to dead zones. Much of the infrastructure built to address these problems decades ago, including our data infrastructure, is struggling to meet these challenges. Much of our water data exists in paper formats unique to the organization collecting the data. Often, these organizations existed long before the personal computer was created (1975) or the internet became mainstream (mid-1990s). As organizations adopted data infrastructure in the late 1990s, it was with the mindset of “normal infrastructure” at the time: built to last for decades rather than to adapt to rapid technological change.

New water data infrastructure with new technologies that enable data to flow seamlessly between users and generate information for real-time management are needed to meet our growing water challenges. Decision-makers need accurate, timely data to understand current conditions, identify sustainability problems, illuminate possible solutions, track progress, and adapt along the way. Stakeholders need easy-to-understand metrics of water conditions so they can make sure managers and policymakers protect the environment and the public’s water supplies. The water community needs to continually improve how they manage this complex resource by using data and communicating information to support decision-making. In short, a sustained effort is required to accelerate the development of open data and information systems to support sustainable water resources management. The Internet of Water (IoW) is designed to be just such an effort….(More)”.
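
As a small, concrete illustration of the kind of machine-readable, near-real-time water data the IoW calls for, the sketch below queries the USGS National Water Information System’s Instantaneous Values web service, one long-standing example of open water data in the United States. The site number and parameter code are examples, and the JSON parsing assumes the service’s standard WaterML-style layout:

```python
# Illustrative sketch: fetch near-real-time streamflow from the USGS NWIS
# Instantaneous Values web service. The site number (01646500, Potomac River)
# and parameter code (00060, discharge) are examples; the JSON layout below
# is assumed to follow the service's documented WaterML-style structure.
import requests

URL = "https://waterservices.usgs.gov/nwis/iv/"
params = {
    "format": "json",
    "sites": "01646500",     # USGS gauge ID
    "parameterCd": "00060",  # discharge, cubic feet per second
}

response = requests.get(URL, params=params, timeout=30)
response.raise_for_status()

for series in response.json()["value"]["timeSeries"]:
    site = series["sourceInfo"]["siteName"]
    variable = series["variable"]["variableName"]
    latest = series["values"][0]["value"][-1]
    print(f"{site}: {variable} = {latest['value']} at {latest['dateTime']}")
```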

Massive Citizen Science Effort Seeks to Survey the Entire Great Barrier Reef


Jessica Wynne Lockhart at Smithsonian: “In August, marine biologists Johnny Gaskell and Peter Mumby and a team of researchers boarded a boat headed into unknown waters off the coast of Australia. For 14 long hours, they ploughed over 200 nautical miles, a Google Maps cache as their only guide. Just before dawn, they arrived at their destination: a previously uncharted blue hole—a cavernous opening descending through the seafloor.

After the rough night, Mumby was rewarded with something he hadn’t seen in his 30-year career. The reef surrounding the blue hole had nearly 100 percent healthy coral cover. Such a find is rare in the Great Barrier Reef, where coral bleaching events in 2016 and 2017 led to headlines proclaiming the reef “dead.”

“It made me think, ‘this is the story that people need to hear,’” Mumby says.

The expedition from Daydream Island off the coast of Queensland was a pilot program to test the methodology for the Great Reef Census, a citizen science project headed by Andy Ridley, founder of the annual conservation event Earth Hour. His latest organization, Citizens of the Great Barrier Reef, has set the ambitious goal of surveying the entire 1,400-mile-long reef system in 2020…(More)”.

The promise and peril of a digital ecosystem for the planet


Blog post by Jillian Campbell and David E Jensen: “A range of frontier and digital technologies have dramatically boosted the ways in which we can monitor the health of our planet and sustain our future on it (Figure 1).

Figure 1. A range of frontier and digital technologies can be combined to monitor our planet and the sustainable use of natural resources (1)

If we can leverage this technology effectively, we will be able to assess and predict risks, increase transparency and accountability in the management of natural resources and inform markets as well as consumer choice. These actions are all required if we are to stand a better chance of achieving the Sustainable Development Goals (SDGs).

However, for this vision to become a reality, public and private sector actors must take deliberate action and collaborate to build a global digital ecosystem for the planet — one consisting of data, infrastructure, rapid analytics, and real-time insights. We are now at a pivotal moment in the history of our stewardship of this planet. A “tipping point” of sorts. And in order to guide the political action which is required to counter the speed, scope and severity of the environmental and climate crises, we must acquire and deploy these data sets and frontier technologies. Doing so can fundamentally change our economic trajectory and underpin a sustainable future.

This article shows how such a global digital ecosystem for the planet can be achieved — as well as what we risk if we do not take decisive action within the next 12 months….(More)”.

Sharing Private Data for Public Good


Stefaan G. Verhulst at Project Syndicate: “After Hurricane Katrina struck New Orleans in 2005, the direct-mail marketing company Valassis shared its database with emergency agencies and volunteers to help improve aid delivery. In Santiago, Chile, analysts from Universidad del Desarrollo, ISI Foundation, UNICEF, and the GovLab collaborated with Telefónica, the city’s largest mobile operator, to study gender-based mobility patterns in order to design a more equitable transportation policy. And as part of the Yale University Open Data Access project, health-care companies Johnson & Johnson, Medtronic, and SI-BONE give researchers access to previously walled-off data from 333 clinical trials, opening the door to possible new innovations in medicine.

These are just three examples of “data collaboratives,” an emerging form of partnership in which participants exchange data for the public good. Such tie-ups typically involve public bodies using data from corporations and other private-sector entities to benefit society. But data collaboratives can help companies, too – pharmaceutical firms share data on biomarkers to accelerate their own drug-research efforts, for example. Data-sharing initiatives also have huge potential to improve artificial intelligence (AI). But they must be designed responsibly and take data-privacy concerns into account.

Understanding the societal and business case for data collaboratives, as well as the forms they can take, is critical to gaining a deeper appreciation of the potential and limitations of such ventures. The GovLab has identified over 150 data collaboratives spanning continents and sectors; they include companies such as Air France, Zillow, and Facebook. Our research suggests that such partnerships can create value in three main ways….(More)”.

Aliens in Europe. An open approach to involve more people in invasive species detection


Paper by Sven Schade et al: “Amplified by globalisation, including increased human mobility and the worldwide shipping of goods, the spread of animals and plants outside their native habitats continues to increase. A few of these ‘aliens’ have negative impacts on their environment, including threats to local biodiversity, agricultural productivity, and human health. Our work addresses these threats, particularly within the European Union (EU), where a related legal framework has been established. We follow an open and participatory approach that allows more people to share their experiences of invasive alien species (IAS) in their surroundings. Over the past three years, we developed a mobile phone application, together with the underlying data management and validation infrastructure, which allows smartphone users to report a selected list of IAS. We have put quality assurance and data integration mechanisms in place that allow this information to be taken up into existing official systems, making it accessible to relevant policy-making at EU level.

This article summarises our scientific methodology and technical approach, explains our decisions, and provides an outlook on the future of IAS monitoring involving citizens and utilising the latest technological advancements. Last but not least, we emphasise software design for reuse, within the domain of IAS monitoring but also for supporting citizen science apps more generally. While much has already been achieved, many scientific, technical and organizational challenges remain to be addressed before data can be seamlessly shared and integrated. Here, we particularly highlight issues that emerge in an international setting, which involves many different stakeholders….(More)”.
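
To make the quality-assurance idea concrete, a minimal pre-screening step for an incoming citizen report might look like the sketch below; the field names, species list and checks are invented for illustration and are not the project’s actual validation pipeline.

```python
# Minimal sketch of an automated pre-screening step for a citizen IAS report.
# Field names, the species list and the checks are illustrative only.
from dataclasses import dataclass
from typing import List

# A shortened, illustrative list of species the app accepts reports for.
REPORTABLE_SPECIES = {
    "Procambarus clarkii",       # red swamp crayfish
    "Heracleum mantegazzianum",  # giant hogweed
    "Vespa velutina",            # Asian hornet
}

@dataclass
class Report:
    species: str
    latitude: float
    longitude: float
    photo_urls: List[str]
    observed_at: str  # ISO 8601 timestamp

def pre_screen(report: Report) -> List[str]:
    """Return a list of problems; an empty list means the report can be
    queued for expert validation and later upload to official systems."""
    problems = []
    if report.species not in REPORTABLE_SPECIES:
        problems.append("species is not on the reportable list")
    if not (-90.0 <= report.latitude <= 90.0 and -180.0 <= report.longitude <= 180.0):
        problems.append("coordinates are out of range")
    if not report.photo_urls:
        problems.append("at least one photo is required for validation")
    return problems

report = Report(
    species="Vespa velutina",
    latitude=45.76,
    longitude=4.84,
    photo_urls=["https://example.org/photo.jpg"],
    observed_at="2019-09-01T10:30:00Z",
)
issues = pre_screen(report)
print("queued for expert validation" if not issues else f"rejected: {issues}")
```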

Tackling Climate Change with Machine Learning


Paper by David Rolnick et al: “Climate change is one of the greatest challenges facing humanity, and we, as machine learning experts, may wonder how we can help. Here we describe how machine learning can be a powerful tool in reducing greenhouse gas emissions and helping society adapt to a changing climate. From smart grids to disaster management, we identify high impact problems where existing gaps can be filled by machine learning, in collaboration with other fields. Our recommendations encompass exciting research questions as well as promising business opportunities. We call on the machine learning community to join the global effort against climate change….(More)”.
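
One of the application areas the paper surveys is electricity systems, where better demand forecasts help grid operators integrate variable renewables and commit less stand-by fossil generation. The toy sketch below is an illustration of that general idea on synthetic data, not code or results from the paper:

```python
# Toy illustration (not from the paper): forecasting hourly electricity demand
# with a gradient-boosted tree model on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)

# Synthetic demand: daily and weekly cycles plus noise (in megawatts).
demand = (
    1000
    + 200 * np.sin(2 * np.pi * hours / 24)        # daily cycle
    + 100 * np.sin(2 * np.pi * hours / (24 * 7))  # weekly cycle
    + rng.normal(0, 30, hours.size)               # weather and noise
)

# Simple calendar features: hour of day and day of week.
X = np.column_stack([hours % 24, (hours // 24) % 7])
split = 24 * 300  # train on ~10 months, test on the rest

model = GradientBoostingRegressor().fit(X[:split], demand[:split])
predictions = model.predict(X[split:])
print(f"mean absolute error: {mean_absolute_error(demand[split:], predictions):.1f} MW")
```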