Coronavirus: how the pandemic has exposed AI’s limitations


Kathy Peach at The Conversation: “It should have been artificial intelligence’s moment in the sun. With billions of dollars of investment in recent years, AI has been touted as a solution to every conceivable problem. So when the COVID-19 pandemic arrived, a multitude of AI models were immediately put to work.

Some hunted for new compounds that could be used to develop a vaccine, or attempted to improve diagnosis. Some tracked the evolution of the disease, or generated predictions for patient outcomes. Some modelled the number of cases expected given different policy choices, or tracked similarities and differences between regions.

The results, to date, have been largely disappointing. Very few of these projects have had any operational impact – hardly living up to the hype or the billions in investment. At the same time, the pandemic highlighted the fragility of many AI models. From entertainment recommendation systems to fraud detection and inventory management – the crisis has seen AI systems go awry as they struggled to adapt to sudden collective shifts in behaviour.

The unlikely hero

The unlikely hero emerging from the ashes of this pandemic is instead the crowd. Crowds of scientists around the world sharing data and insights faster than ever before. Crowds of local makers manufacturing PPE for hospitals failed by supply chains. Crowds of ordinary people organising through mutual aid groups to look after each other.

COVID-19 has reminded us of just how quickly humans can adapt existing knowledge, skills and behaviours to entirely new situations – something that highly specialised AI systems just can’t do. At least not yet….

In one of the experiments, researchers from the Istituto di Scienze e Tecnologie della Cognizione in Rome studied the use of an AI system designed to reduce social biases in collective decision-making. The AI, which held back information from the group members on what others thought early on, encouraged participants to spend more time evaluating the options by themselves.

The system succeeded in reducing the tendency of people to “follow the herd” – failing to hear diverse or minority views, or to challenge assumptions – all criticisms that have been levelled at the British government’s scientific advisory committees throughout the pandemic…(More)”.

A Way Forward: Governing in an Age of Emergence


Paper by UNDP: “…This paper seeks to go beyond mere analysis of the spectrum of problems and risks we face, identifying a portfolio of possibilities (POPs) and articulating a new framework for governance and government. The purpose of these POPs is not to define the future but to challenge, to innovate, to expand the range of politically acceptable policies, and to establish a foundation for statecraft in an age of risk and uncertainty.

As its name suggests, we recognise that A Way Forward is and must be one of many pathways to explore the future of governance. It is the beginning of a journey; one on which you are invited to join us to help evolve the provocations into new paradigms and policy options that seek to chart an alternative pathway to governance and statecraft.

A Way Forward is a petition for seeding new transnational alliances based on shared interests and vulnerability. We believe the future will be built across a new constellation of governmental alliances, where innovation in statecraft and governance is achieved collaboratively. Our key objective is to establish a platform to host these transnational discussions, and move us towards the new capabilities that are necessary for statecraft in the age of risk and uncertainty….(More)”.

Narrative Observatory


About: “With initial support from the Bill & Melinda Gates Foundation, we are designing and developing a new purpose-built, multi-disciplinary, cross-institutional data platform to enable the reliable identification, measurement, and tracking of cultural narratives over long time scales across multiple cultural domains and media types, like online news, broadcast television, talk radio, and social media. Designed to provide better understanding of the cultural environment for key social issues, and more effective measurement of efforts to alter these environments, the goal is to help narrative change makers reach smarter strategic decisions and better understand their work’s impact.

We’re starting by looking at narratives around poverty and economic mobility in the U.S…(More)”.

The European data market


European Commission: “The first European Data Market study (SMART 2013/0063), contracted by the European Commission in 2013, made a first attempt to provide facts and figures on the size and trends of the EU data economy by developing a European data market monitoring tool.

The final report of the updated European Data Market (EDM) study (SMART 2016/0063) now presents in detail the results of the final round of measurement of the updated European Data Market Monitoring Tool contracted for the 2017-2020 period.

Designed along a modular structure, the first pillar of the study, the European Data Market Monitoring Tool, is built around a core set of quantitative indicators that assess the emerging data market at present (i.e. the years 2018 through 2020), with projections to 2025.

The key areas covered by the indicators measured in this report are:

  • The data professionals and the balance between demand and supply of data skills;
  • The data companies and their revenues;
  • The data user companies and their spending on data technologies;
  • The market of digital products and services (“Data market”);
  • The data economy and its impacts on the European economy;
  • Forecast scenarios of all the indicators, based on alternative market trajectories.

Additionally, as a second major work stream, the study also presents a series of descriptive stories providing a complementary view to the one offered by the Monitoring Tool (for example, “How Big Data is driving AI” or “The Secondary Use of Health Data and Data-driven Innovation in the European Healthcare Industry”), adding fresh, real-life information around the quantitative indicators. By focusing on specific issues and aspects of the data market, the stories offer an initial, indicative “catalogue” of good practices of what is happening in the data economy today in Europe and what is likely to affect the development of the EU data economy in the medium term.

Finally, as a third work stream of the study, a landscaping exercise on the EU data ecosystem was carried out together with some community building activities to bring stakeholders together from all segments of the data value chain. The map presenting the results of the landscaping of the EU data economy, as well as the reports from the webinars organised by the study, are available on the www.datalandscape.eu website….(More)”.

The Ages of Globalization: Geography, Technology, and Institutions


Book by Jeffrey D. Sachs: “Today’s most urgent problems are fundamentally global. They require nothing less than concerted, planetwide action if we are to secure a long-term future. But humanity’s story has always been on a global scale. In this book, Jeffrey D. Sachs, renowned economist and expert on sustainable development, turns to world history to shed light on how we can meet the challenges and opportunities of the twenty-first century.

Sachs takes readers through a series of seven distinct waves of technological and institutional change, starting with the original settling of the planet by early modern humans through long-distance migration and ending with reflections on today’s globalization. Along the way, he considers how the interplay of geography, technology, and institutions influenced the Neolithic revolution; the role of the horse in the emergence of empires; the spread of large land-based empires in the classical age; the rise of global empires after the opening of sea routes from Europe to Asia and the Americas; and the industrial age. The dynamics of these past waves, Sachs demonstrates, offer fresh perspective on the ongoing processes taking place in our own time—a globalization based on digital technologies. Sachs emphasizes the need for new methods of international governance and cooperation to prevent conflicts and to achieve economic, social, and environmental objectives aligned with sustainable development. The Ages of Globalization is a vital book for all readers aiming to make sense of our rapidly changing world….(More)”.

Community Quality-of-Life Indicators


Book edited by Frank Ridzi, Chantal Stevens and Melanie Davern: “This book offers critical insights into the thriving international field of community indicators, incorporating the experiences of government leaders, philanthropic professionals, community planners and a wide range of academic disciplines. It illuminates the important role of community indicators in diverse settings and the rationale for the development and implementation of these innovative projects.  This book details many of the practical “how to” aspects of the field as well as lessons learned from implementing indicators in practice.

The case studies included here also demonstrate how, using a variety of data applications, leaders of today are monitoring and measuring progress and communities are empowered to make sustainable improvements in their wellbeing. With examples related to the environment, economy, planning, community engagement and health, among others, this book epitomizes the constant innovation, collaborative partnerships and the consummate interdisciplinarity of the community indicators field of today….(More)”.

The next Big Data battlefield: Server Geography


Maroosha Muzaffar at OZY: “At the G-20 summit last June, when Japanese Prime Minister Shinzo Abe introduced a resolution endorsing the free flow of data across borders, India, South Africa and Indonesia refused to sign it. India’s then foreign secretary Vijay Gokhale described data as a “new form of wealth” to explain the country’s reluctance to part with it.

It wasn’t an isolated standoff. President Donald Trump’s trade war with China and tariff battles with India and Europe dominated the global financial discourse in the months before the coronavirus crisis. But the next trade conflict after the pandemic eases is already brewing, and it won’t involve only tariffs on products. It’ll be focused on territorial control of data.

A growing number of emerging economies with giant populations, like China, India, Nigeria, Indonesia and South Africa, are leveraging the markets they offer to demand that foreign firms keep the data they gather from these countries within their borders, and not on servers in the West. That’s leading to rising tensions over “data localization,” especially with the U.S., which has an overall global trade deficit but enjoys a massive trade surplus in digital services — in good measure because of its control over global data, say experts.

Indian Prime Minister Narendra Modi dangled his country’s 1.3 billion-strong market during a visit to the U.S. last September, calling data the “new gold.” China has 13 data localization laws that span all sectors of life — all data on Chinese nationals and infrastructure must be stored within the country. Nigeria has a similar requirement. An Indian government panel has meanwhile recommended that New Delhi do the same…(More)”.

Governing Simulations: Intro to Necroeconomics


Bryan Wolff, Yevheniia Berchul, Yu Gong, Andrey Shevlyakov at Strelka Mag: “French philosopher Michel Foucault defined biopower as the power over bodies, or the social and political techniques to control people’s lives. Cameroonian philosopher Achille Mbembe continued this line of thinking to arrive at necropolitics, the politics of death, or as he phrases it: “contemporary forms of subjugation of life, to the power of death.” COVID-19 has put these powers in sharp relief. Most world-changing events of the twenty-first century have been internalized with the question “where were you?” For example, “where were you when the planes hit?” But the pandemic knows no single universal moment to refer to. It’s become as much a question of when, as of where. “When did you take the pandemic seriously?” Most likely, your answer stands in direct relation to your proximity to death. Whether a critical mass or a specific loss, fatality defined COVID-19’s reality.

For many governments, it wasn’t the absolute count of death, but rather its simulations that made them take action. The United States was one of the last countries holding out on a lockdown until the Imperial College report projected the possibility of two million to four million fatalities in the US alone (if no measures were taken). And these weren’t the only simulations being run. A week into the lockdown, commentators wondered aloud whether it was all worth the cost. It was a unique public reveal of the deadly economics—or necroeconomics—that we’re usually insulated from, whether through specialist language games or simply because they’re too grim to face. But ignoring the financialization of our demise doesn’t make it go away. If we are to ever reconsider the systems meant to keep us alive, we’d better get familiar. What better place to start than to see the current crisis through the eyes of one of the most widely used models of death: the one that puts a price on life. It’s called the “Value of a Statistical Life,” or VSL…(More)”.
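The VSL arithmetic the excerpt gestures at can be sketched as a back-of-envelope comparison. The figures below — a $10 million VSL, roughly the value US federal agencies used around 2020, and an $8 trillion lockdown cost — are illustrative assumptions for the sake of the sketch, not numbers from the article:

```python
# Back-of-envelope VSL comparison (illustrative figures, not from the article).
VSL = 10_000_000  # assumed ~$10M per statistical life, in line with US agency practice


def mortality_cost(expected_deaths, vsl=VSL):
    """Monetised cost of projected fatalities under the VSL convention."""
    return expected_deaths * vsl


# Imperial College-style projection: ~2M deaths averted if measures are taken
deaths_averted = 2_000_000
benefit = mortality_cost(deaths_averted)  # $20 trillion in averted "statistical" losses
lockdown_cost = 8_000_000_000_000         # hypothetical $8T economic cost of lockdown

print(f"Monetised benefit of averted deaths: ${benefit / 1e12:.0f} trillion")
print(f"Lockdown passes the VSL cost-benefit test: {benefit > lockdown_cost}")
```

Under this (crude) calculus, the lockdown "pays for itself" whenever the monetised value of averted deaths exceeds the economic cost — which is precisely the kind of grim comparison the authors mean by necroeconomics.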

From Idea to Reality: Why We Need an Open Data Policy Lab


Stefaan G. Verhulst at Open Data Policy Lab: “The belief that we are living in a data age — one characterized by unprecedented amounts of data, with unprecedented potential — has become mainstream. We regularly read phrases such as “data is the most valuable commodity in the global economy” or that data provides decision-makers with an “ever-swelling flood of information.”

Without a doubt, there is truth in such statements. But they also leave out a major shortcoming — the fact that much of the most useful data remains inaccessible, hidden in silos, behind digital walls, and in untapped “treasuries.”

For close to a decade, the technology and public interest communities have pushed the idea of open data. At its core, open data represents a new paradigm of data availability and access. The movement borrows from the language of open source and is rooted in notions of a “knowledge commons”, a concept developed by, among others, Nobel Prize winner Elinor Ostrom.

Milestones and Limitations in Open Data

Significant milestones have been achieved in the short history of the open data movement. Around the world, an ever-increasing number of governments at the local, state and national levels now release large datasets for the public’s benefit. For example, New York City requires that all public data be published on a single web portal. The current portal site contains thousands of datasets that fuel projects on topics as diverse as school bullying, sanitation, and police conduct. In California, the Forest Practice Watershed Mapper allows users to track the impact of timber harvesting on aquatic life through the use of the state’s open data. Similarly, Denmark’s Building and Dwelling Register releases address data to the public free of charge, improving transparent property assessment for all interested parties.

A growing number of private companies have also initiated or engaged in “Data Collaborative” projects to leverage their private data toward the public interest. For example, Valassis, a direct-mail marketing company, shared its massive address database with community groups in New Orleans to visualize and track block-by-block repopulation rates after Hurricane Katrina. A wide number of data collaboratives are also currently being launched to respond to the COVID-19 pandemic. Through its COVID-19 Data Collaborative Program, the location-intelligence company Cuebiq is providing researchers access to the company’s data to study, for instance, the impacts of social distancing policies in Italy and New York City. The health technology company Kinsa Health’s US Health Weather initiative is likewise visualizing the rate of fever across the United States using data from its network of Smart Thermometers, thereby providing early indications regarding the location of likely COVID-19 outbreaks.

Yet despite such initiatives, many open data projects (and data collaboratives) remain fledgling — especially those at the state and local level.

Among other issues, the field has trouble scaling projects beyond initial pilots, and many potential stakeholders — private sector and government “owners” of data, as well as public beneficiaries — remain skeptical of open data’s value. In addition, terabytes of potentially transformative data remain inaccessible for re-use. It is absolutely imperative that we continue to make the case to all stakeholders regarding the importance of open data, and of moving it from an interesting idea to an impactful reality. In order to do this, we need a new resource — one that can inform the public and data owners, and that would guide decision-makers on how to achieve open data in a responsible manner, without undermining privacy and other rights.

Purpose of the Open Data Policy Lab

Today, with support from Microsoft and under the counsel of a global advisory board of open data leaders, The GovLab is launching an initiative designed precisely to build such a resource.

Our Open Data Policy Lab will draw on lessons and experiences from around the world to conduct analysis, provide guidance, build community, and take action to accelerate the responsible re-use and opening of data for the benefit of society and the equitable spread of economic opportunity…(More)”.

‘For good measure’: data gaps in a big data world


Paper by Sarah Giest & Annemarie Samuels: “Policy and data scientists have paid ample attention to the amount of data being collected and the challenge for policymakers to use and utilize it. However, far less attention has been paid to the quality and coverage of these data, specifically as they pertain to minority groups. The paper makes the argument that while there is seemingly more data to draw on for policymakers, the quality of the data in combination with potential known or unknown data gaps limits government’s ability to create inclusive policies. In this context, the paper defines primary, secondary, and unknown data gaps that cover scenarios of knowingly or unknowingly missing data and how that is potentially compensated for through alternative measures.

Based on the review of the literature from various fields and a variety of examples highlighted throughout the paper, we conclude that the big data movement combined with more sophisticated methods in recent years has opened up new opportunities for government to use existing data in different ways as well as fill data gaps through innovative techniques. Focusing specifically on the representativeness of such data, however, shows that data gaps affect the economic opportunities, social mobility, and democratic participation of marginalized groups. The big data movement in policy may thus create new forms of inequality that are harder to detect and whose impact is more difficult to predict….(More)”.