Resilience in the Digital Age


Book edited by Fred S. Roberts and Igor A. Sheremet: “The growth of a global digital economy has enabled rapid communication, instantaneous movement of funds, and availability of vast amounts of information. With this come challenges such as the vulnerability of digitalized sociotechnological systems (STSs) to destructive events (earthquakes, disease events, terrorist attacks). Similar issues arise for disruptions to complex linked natural and social systems (from changing climates, evolving urban environments, etc.). This book explores new approaches to the resilience of sociotechnological and natural-social systems in a digital world of big data, extraordinary computing capacity, and rapidly developing methods of Artificial Intelligence….

The world-wide COVID-19 pandemic illustrates the vulnerability of our healthcare systems, supply chains, and social infrastructure, and confronts our notions of what makes a system resilient. We have found that use of AI tools can lead to problems when unexpected events occur. On the other hand, the vast amounts of data available from sensors, satellite images, social media, etc. can also be used to make modern systems more resilient.

Papers in the book explore disruptions of complex networks and algorithms that minimize departure from a previous state after a disruption; introduce a multigrammatical framework for the technological and resource bases of today’s large-scale industrial systems and the transformations resulting from disruptive events; and explain how robotics can enhance pre-emptive measures or post-disaster responses to increase resilience. Other papers explore current directions in data processing and handling, including principles of FAIRness in data, and how the availability of large amounts of data can aid the development of resilient STSs, as well as the challenges to overcome in doing so. The book also addresses interactions between humans and built environments, focusing on how AI can inform today’s smart and connected buildings and make them resilient, and how AI tools can increase resilience to misinformation and its dissemination….(More)”.

New approach to data is a great opportunity for the UK post-Brexit


Oliver Dowden at the Financial Times: “As you read this, thousands of people are receiving a message that will change their lives: a simple email or text, inviting them to book their Covid jab. But what has powered the UK’s remarkable vaccine rollout isn’t just our NHS, but the data that sits underneath it — from the genetic data used to develop the vaccine right through to the personal health data enabling that “ping” on their smartphone.

After years of seeing data solely through the lens of risk, Covid-19 has taught us just how much we have to lose when we don’t use it.

As I launch the competition to find the next Information Commissioner, I want to set out a bold new approach that capitalises on all we’ve learnt during the pandemic, which forced us to share data quickly, efficiently and responsibly for the public good. It is one that no longer sees data as a threat, but as the great opportunity of our time.

Until now, the conversation about data has revolved around privacy — and with good reason. A person’s digital footprint can reveal not just vital statistics like age and gender, but also personal habits.

Our first priority is securing this valuable personal information. The UK has a long and proud tradition of defending privacy, and a commitment to maintaining world-class data protection standards now that we’re outside the EU. That was recognised last week in the bloc’s draft decisions on the ‘adequacy’ of our data protection rules — the agreement that data can keep flowing freely between the EU and UK.

We fully intend to maintain those world-class standards. But to do so, we do not need to copy and paste the EU’s rule book, the General Data Protection Regulation (GDPR), word-for-word. Countries as diverse as Israel and Uruguay have successfully secured adequacy with Brussels despite having their own data regimes. Not all of those were identical to GDPR, but equal doesn’t have to mean the same. The EU doesn’t hold the monopoly on data protection.

So, having come a long way in learning how to manage data’s risks, the UK is going to start making more of its opportunities….(More)”.

Surveillance and the ‘New Normal’ of Covid-19: Public Health, Data, and Justice


Report by the Social Science Research Council: “The Covid-19 pandemic has dramatically altered the way nations around the world use technology in public health. As the virus spread globally, some nations responded by closing businesses, shuttering schools, limiting gatherings, and banning travel. Many also deployed varied technological tools and systems to track virus exposure, monitor outbreaks, and aggregate hospital data.

Some regions are still grappling with crisis-level conditions, and others are struggling to navigate the complexities of vaccine rollouts. Amid the upheavals, communities are adjusting to a new normal, in which mask-wearing has become as commonplace as seatbelt use and digital temperature checks are a routine part of entering public buildings.

Even as the frenzy of emergency responses begins to subside, the emergent forms of surveillance that have accompanied this new normal persist. As a consequence, societies face new questions about how to manage the monitoring systems created in response to the virus, what processes are required in order to immunize populations, and what new norms the systems have generated. How they answer these questions will have long-term impacts on civil liberties, governance, and the role of technology in society. The systems implemented amid the public health emergency could jeopardize individual freedoms and exacerbate harms to already vulnerable groups, particularly if they are adapted to operate as permanent social management tools. At the same time, growing public awareness about the impact of public health technologies could also provide a catalyst for strengthening democratic engagement and demonstrating the urgency of improving governance systems. As the world transitions in and out of pandemic crisis modes, there is an opportunity to think broadly about strengthening public health systems, policymaking, and the underlying structure of our social compacts.

The stakes are high: an enduring lesson from history is that moments of crisis often recast the roles of governments and the rights of individuals. In this moment of flux, the Social Science Research Council calls on policymakers, technologists, data scientists, health experts, academics, activists, and communities around the world to assess the implications of this transformation and seize opportunities for positive social change. The Council seeks to facilitate a shift from reactive modes of crisis response to more strategic forms of deliberation among varied stakeholders. As such, it has convened discussions and directed research in order to better understand the intersection of governance and technologically enabled surveillance in conditions of public health emergencies. Through these activities, the Council aims to provide analysis that can help foster societies that are more resilient, democratic, and inclusive and can, therefore, better withstand future crises.

With these goals in mind, the Council convened a cross-disciplinary, multinational group of experts in the summer of 2020 to survey the landscape of human rights and social justice with regard to technologically driven public health practices. The resulting group—the Public Health, Surveillance, and Human Rights (PHSHR) Network—raised a broad range of questions about governance, social inequalities, data protection, medical systems, and community norms: What rules should govern the sharing of personal health data? How should the efficacy of public health interventions be weighed against the emergence and expansion of new forms of surveillance? How much control should multinational corporations have in designing and implementing nations’ public health technology systems? These are among the questions that pushed members to think beyond traditional professional, geographic, and intellectual boundaries….(More)”.

Balancing Privacy With Data Sharing for the Public Good


David Deming at the New York Times: “Governments and technology companies are increasingly collecting vast amounts of personal data, prompting new laws, myriad investigations and calls for stricter regulation to protect individual privacy.

Yet despite these issues, economics tells us that society needs more data sharing rather than less, because the benefits of publicly available data often outweigh the costs. Public access to sensitive health records sped up the development of lifesaving medical treatments like the messenger-RNA coronavirus vaccines produced by Moderna and Pfizer. Better economic data could vastly improve policy responses to the next crisis.

Data increasingly powers innovation, and it needs to be used for the public good, while individual privacy is protected. This is new and unfamiliar terrain for policymaking, and it requires a careful approach.

The pandemic has brought the increasing dominance of big, data-gobbling tech companies into sharp focus. From online retail to home entertainment, digitally savvy businesses are collecting data and deploying it to anticipate product demand and set prices, lowering costs and outwitting more traditional competitors.

Data provides a record of what has already happened, but its main value comes from improving predictions. Companies like Amazon choose products and prices based on what you — and others like you — bought in the past. Your data improves their decision-making, boosting corporate profits.

Private companies also depend on public data to power their businesses. Redfin and Zillow disrupted the real estate industry thanks to access to public property databases. Investment banks and consulting firms make economic forecasts and sell insights to clients using unemployment and earnings data collected by the Department of Labor. By 2013, one study estimated, public data contributed at least $3 trillion per year to seven sectors of the economy worldwide.

The buzzy refrain of the digital age is that “data is the new oil,” but this metaphor is inaccurate. Data is indeed the fuel of the information economy, but it is more like solar energy than oil — a renewable resource that can benefit everyone at once, without being diminished….(More)”.

A.I. Here, There, Everywhere


Craig S. Smith at the New York Times: “I wake up in the middle of the night. It’s cold.

“Hey, Google, what’s the temperature in Zone 2?” I say into the darkness. A disembodied voice responds: “The temperature in Zone 2 is 52 degrees.” “Set the heat to 68,” I say, and then I ask the gods of artificial intelligence to turn on the light.

Many of us already live with A.I., an array of unseen algorithms that control our Internet-connected devices, from smartphones to security cameras and cars that heat the seats before you’ve even stepped out of the house on a frigid morning.

But, while we’ve seen the A.I. sun, we have yet to see it truly shine.

Researchers liken the current state of the technology to cellphones of the 1990s: useful, but crude and cumbersome. They are working on distilling the largest, most powerful machine-learning models into lightweight software that can run on “the edge,” meaning small devices such as kitchen appliances or wearables. Our lives will gradually be interwoven with brilliant threads of A.I.

Our interactions with the technology will become increasingly personalized. Chatbots, for example, can be clumsy and frustrating today, but they will eventually become truly conversational, learning our habits and personalities and even developing personalities of their own. But don’t worry: the fever dreams of superintelligent machines taking over, like HAL in “2001: A Space Odyssey,” will remain science fiction for a long time to come; consciousness, self-awareness and free will in machines are far beyond the capabilities of science today.

Privacy remains an issue, because artificial intelligence requires data to learn patterns and make decisions. But researchers are developing methods to use our data without actually seeing it — so-called federated learning, for example — or to encrypt it in ways that currently can’t be hacked….(More)”
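The federated learning the article alludes to can be sketched in a few lines of Python. This is a toy illustration under stated assumptions, not any real library’s API: three simulated “clients” each hold private samples of the same relationship, and only model weights — never the raw data — are exchanged with the averaging “server.”

```python
import random

def local_step(w, data, lr=0.1):
    # Plain gradient descent on a one-parameter model y = w * x,
    # touching only this client's private (x, y) pairs.
    for x, y in data:
        w -= lr * (w * x - y) * x
    return w

def federated_round(w_global, clients):
    # Each client trains locally; the "server" only ever sees weights,
    # which it averages into the next global model (FedAvg-style).
    local_weights = [local_step(w_global, data) for data in clients]
    return sum(local_weights) / len(local_weights)

random.seed(0)
# Three clients, each holding private samples of the same relation y = 2x.
clients = [[(x, 2.0 * x) for x in (random.uniform(0.5, 1.5) for _ in range(20))]
           for _ in range(3)]

w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
# w now approximates the true slope (2.0) without raw data leaving any client
```

Real deployments add safeguards this sketch omits, such as secure aggregation and differential privacy, since model updates alone can still leak information.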

My Data, My Choice? – German Patient Organizations’ Attitudes towards Big Data-Driven Approaches in Personalized Medicine. An Empirical-Ethical Study


Paper by Carolin Martina Rauter, Sabine Wöhlke & Silke Schicktanz: “Personalized medicine (PM) operates with biological data to optimize therapy or prevention and to achieve cost reduction. Associated data may consist of a wide range of informational subtypes, e.g. genetic characteristics and their epigenetic modifications, biomarkers or even individual lifestyle factors. Recent innovations in information technology have already enabled the processing of increasingly large amounts of such data (‘volume’) from various sources (‘variety’) and of varying quality in terms of data accuracy (‘veracity’), facilitating the generation and analysis of messy data sets within short, highly efficient time periods (‘velocity’) to provide insights into previously unknown connections and correlations between different items (‘value’). As such developments are characteristic of Big Data approaches, Big Data itself has become an important catchphrase that is closely linked to the emerging foundations and approaches of PM. However, as experts in the debate have already pointed out ethical concerns, the moral concerns of stakeholders such as patient organizations (POs) need to be reflected in this context as well. We used an empirical-ethical approach, including a website analysis and 27 telephone interviews, to gain in-depth insight into German POs’ perspectives on PM and Big Data. Our results show that not all POs are stakeholders in the same way. Comparing the perspectives and political engagement of the minority of POs currently active in PM and Big Data-driven research led to four stakeholder sub-classifications: ‘mediators’ support research projects by facilitating researchers’ access to the patient community while also selecting which projects they prefer to support; ‘cooperators’ tend to contribute more directly to research projects by providing and implementing patient perspectives; ‘financers’ provide financial resources; and ‘independents’ keep control over their collected samples and associated patient-related information, with a strong interest in making autonomous decisions about their scientific use. A more detailed terminology for the involvement of POs as stakeholders makes it easier to address their aims and goals. Based on our results, the ‘independents’ subgroup is a promising candidate for future collaborations in scientific research. Additionally, we identified gaps in POs’ knowledge about PM and Big Data. Based on these findings, approaches can be developed to increase data and statistical literacy. This way, the full potential of stakeholder involvement of POs can be made accessible in discourses around PM and Big Data….(More)”.

Designing Data Trusts. Why We Need to Test Consumer Data Trusts Now


Policy Brief by Aline Blankertz: “Data about individuals, about their preferences and behaviors, has become an increasingly important resource for companies, public agencies, and research institutions. Consumers carry the burden of having to decide which data about them is shared for which purpose. They want to make sure that data about them is not used to infer intimate details of their private life or to pursue other undesirable purposes. At the same time, they want to benefit from personalized products and innovation driven by the same data. The complexity of how data is collected and used overwhelms consumers, many of whom wearily accept privacy policies and lose trust that those who gain effective control over the data will use it for the consumers’ benefit.

At the same time, a few large companies accumulate and lock in vast amounts of data that enable them to use insights across markets and across consumers. In Europe, the General Data Protection Regulation (GDPR) has given data rights to consumers to assert their interests vis-à-vis those companies, but it gives consumers neither enough information nor enough power to make themselves heard. Other organizations, especially small businesses or start-ups, do not have access to the data (unless individual consumers laboriously exercise their right to portability), which often inhibits competition and innovation. Governments across Europe would like to tackle the challenge of reconciling productive data use with privacy. In recent months, data trusts have emerged as a promising solution to enable data-sharing for the benefit of consumers.

The concept has been endorsed by a broad range of stakeholders, including privacy advocates, companies and expert commissions. In Germany, for example, the Data Ethics Commission and the Competition Law 4.0 Commission have recommended further exploring data trusts, and the government is incorporating the concept into its data strategy.

There is no common understanding yet of what consumer data trusts are and what they do. In order for them to address the problems mentioned, it is helpful to use a working definition: consumer data trusts are intermediaries that aggregate consumers’ interests and represent them vis-à-vis data-using organizations. Data trusts use more technical and legal expertise, as well as greater bargaining power, to negotiate with organizations on the conditions of data use, achieving better outcomes than individual consumers can achieve on their own. To achieve their consumer-oriented mission, data trusts should be able to assign access rights, audit data practices, and support enforcement. They may or may not need to hold data…(More)”.

Regulation of Algorithmic Tools in the United States


Paper by Christopher S. Yoo and Alicia Lai: “Policymakers in the United States have only in recent years begun to address the regulation of artificial intelligence technologies, gaining momentum through calls for additional research funding, piecemeal guidance, proposals, and legislation at all levels of government. This Article provides an overview of high-level federal initiatives for general artificial intelligence (AI) applications set forth by the U.S. president and responding agencies, early indications from the incoming Biden Administration, targeted federal initiatives for sector-specific AI applications, pending federal legislative proposals, and state and local initiatives. The regulation of the algorithmic ecosystem will continue to evolve as the United States continues to search for the right balance between ensuring public safety and transparency and promoting innovation and competitiveness on the global stage….(More)”.

Inside the ‘Wikipedia of Maps,’ Tensions Grow Over Corporate Influence


Corey Dickinson at Bloomberg: “What do Lyft, Facebook, the International Red Cross, the U.N., the government of Nepal and Pokémon Go have in common? They all use the same source of geospatial data: OpenStreetMap, a free, open-source online mapping service akin to Google Maps or Apple Maps. But unlike those corporate-owned mapping platforms, OSM is built on a network of mostly volunteer contributors. Researchers have described it as the “Wikipedia for maps.”

Since it launched in 2004, OpenStreetMap has become an essential part of the world’s technology infrastructure. Hundreds of millions of monthly users interact with services derived from its data, from ridehailing apps, to social media geotagging on Snapchat and Instagram, to humanitarian relief operations in the wake of natural disasters. 

But recently the map has been changing, due to the growing impact of the private-sector companies that rely on it. In a 2019 paper published in the ISPRS International Journal of Geo-Information, a cross-institutional team of researchers traced how Facebook, Apple, Microsoft and other companies have gained prominence as editors of the map. Their priorities, the researchers say, are driving significant changes in what is being mapped compared with the past. 

“OpenStreetMap’s data is crowdsourced, which has always made spectators to the project a bit wary about the quality of the data,” says Dipto Sarkar, a professor of geoscience at Carleton University in Ottawa, and one of the paper’s co-authors. “As the data becomes more valuable and is used for an ever-increasing list of projects, the integrity of the information has to be almost perfect. These companies need to make sure there’s a good map of the places they want to expand in, and nobody else is offering that, so they’ve decided to fill it in themselves.”…(More)”.

Centre for Applied Data Ethics Strategy – Enabling ethically appropriate research and statistics for the public good


Foreword by Professor Sir Ian Diamond: “I am delighted to introduce the UK Statistics Authority’s new Centre for Applied Data Ethics, which we committed to establishing in the UK Statistics Authority’s five-year strategy published last year. Being able to show that researchers, statisticians and analysts have considered not only how they can use data but also how they should use it from an ethical perspective is vital to ensuring public acceptability around the use of data for research and statistical purposes. For this reason, I believe it is important that, as the UK’s national statistical institute, we play a lead role in providing statisticians, researchers and analysts with applied sources of advice, guidance and other tools to help them ensure they use data in ethically appropriate ways. I have therefore established the UK Statistics Authority’s Centre for Applied Data Ethics with the aim of being recognised as world leaders in the practical application of data ethics for statistics and research.

The new Centre will build on the excellent work of the National Statistician’s Data Ethics Advisory Committee that will continue to provide me with valuable independent data ethics advice and assurance about the collection and use of data for research and statistics.

The significant role the analytical community across Government and beyond has played in informing the response to the ongoing Covid-19 pandemic highlights the importance of using data in new ways to produce timely statistics, research and analysis to inform the important policy questions of the day. Demonstrating how we apply the principles of good data ethics is an important part of this and is a key enabler, and safeguard, to unlock the power of data for better research and statistics for the public good. By focussing our efforts on providing practical data ethics support and guidance to researchers collecting and using data, the Centre for Applied Data Ethics will help the UK Statistics Authority to meet its strategic objectives of producing statistics for the public good to inform the UK, improve lives and build for the future….(More)”.