Technological Obsolescence

Essay by Jonathan Coopersmith: “In addition to killing over a million Americans, Covid-19 revealed embarrassing failures of local, state, and national public health systems to accurately and effectively collect, transmit, and process information. To some critics and reporters, the visible and easily understood face of those failures was the continued use of fax machines.

In reality, the critics were attacking the symptom, not the problem. Instead of “why were people still using fax machines?,” the better question was “what factors made fax machines more attractive than more capable technologies?” Those answers provide a better window into the complex, evolving world of technological obsolescence, a key component of our modern world—and on a smaller scale, provide a template to decide whether the NAE and other organizations should retain their fax machines.

The marketing dictionary of Monash University Business School defines technological obsolescence as “when a technical product or service is no longer needed or wanted even though it could still be in working order.” Significantly, the source is a business school, which implies strong economic and social factors in decision making about technology.  

Determining technological obsolescence depends not just on creators and promoters of new technologies but also on users, providers, funders, accountants, managers, standards setters—and, most importantly, competing needs and options. In short, it’s complicated.  

Like most aspects of technology, perspectives on obsolescence depend on your position. If existing technology meets your needs, upgrading may not seem worth the resources needed (e.g., for purchase and training). If, on the other hand, your firm or organization depends on income from providing, installing, servicing, training, advising, or otherwise benefiting from a new technology, not upgrading could jeopardize your future, especially in a very competitive market. And if you cannot find the resources to upgrade, you—and your users—may incur both visible and invisible costs…(More)”.

WHO Launches Global Infectious Disease Surveillance Network

Article by Shania Kennedy: “The World Health Organization (WHO) launched the International Pathogen Surveillance Network (IPSN), a public health network to prevent and detect infectious disease threats before they become epidemics or pandemics.

IPSN will rely on insights generated from pathogen genomics, which helps analyze the genetic material of viruses, bacteria, and other disease-causing micro-organisms to determine how they spread and how infectious or deadly they may be.

Using these data, researchers can identify and track diseases to improve outbreak prevention, response, and treatments.

“The goal of this new network is ambitious, but it can also play a vital role in health security: to give every country access to pathogen genomic sequencing and analytics as part of its public health system,” said WHO Director-General Tedros Adhanom Ghebreyesus, PhD, in the press release.  “As was so clearly demonstrated to us during the COVID-19 pandemic, the world is stronger when it stands together to fight shared health threats.”

Genomics capacity worldwide was scaled up during the pandemic, but the press release indicates that many countries still lack effective tools and systems for public health data collection and analysis. This lack of resources and funding could slow the development of a strong global health surveillance infrastructure, which IPSN aims to help address.

The network will bring together experts in genomics and data analytics to optimize routine disease surveillance, including for COVID-19. According to the press release, pathogen genomics-based analyses of the SARS-CoV-2 virus helped speed the development of effective vaccines and the identification of more transmissible virus variants…(More)”.

German lawmakers mull creating first citizen assembly

APNews: “German lawmakers considered Wednesday whether to create the country’s first “citizen assembly” to advise parliament on the issue of food and nutrition.

Germany’s three governing parties back the idea of appointing consultative bodies made up of members of the public selected through a lottery system who would discuss specific topics and provide nonbinding feedback to legislators. But opposition parties have rejected the idea, warning that such citizen assemblies risk undermining the primacy of parliament in Germany’s political system.

Baerbel Bas, the speaker of the lower house, or Bundestag, said that she views such bodies as a “bridge between citizens and politicians that can provide a fresh perspective and create new confidence in established institutions.”

“Everyone should be able to have a say,” Bas told daily Passauer Neue Presse. “We want to better reflect the diversity in our society.”

Environmental activists from the group Last Generation have campaigned for the creation of a citizen assembly to address issues surrounding climate change. However, the group argues that proposals drawn up by such a body should at the very least result in bills that lawmakers would then vote on.

Similar efforts to create citizen assemblies have taken place in other European countries such as Spain, Finland, Austria, Britain and Ireland…(More)”.

The Curious Side Effects of Medical Transparency

Essay by Danielle Ofri: “Transparency, Pozen told me, “invites conceptual confusion about whether it’s a first-order good that we’re trying to pursue for its own sake, or a second-order good that we’re trying to use instrumentally to achieve other goods.” In the first case, we might feel that transparency is an ideal always worth embracing, whatever the costs. In the second, we might ask ourselves what it’s accomplishing, and how it compares with other routes to the same end.

“There is a standard view that transparency is all good—the more transparency, the better,” the philosopher C. Thi Nguyen, an associate professor at the University of Utah, told me. But “you have a completely different experience of transparency when you are the subject.” In a previous position, Nguyen had been part of a department that had to provide evidence that it was using state funding to satisfactorily educate its students. Philosophers, he told me, would want to describe their students’ growing reflectiveness, curiosity, and “intellectual humility,” but knew that this kind of talk would likely befuddle or bore legislators; they had to focus instead on concrete numbers, such as graduation rates and income after graduation. Nguyen and his colleagues surely want their students to graduate and earn a living wage, but such stats hardly sum up what it means to be a successful philosopher.

In Nguyen’s view, this illustrates a problem with transparency. “In any scheme of transparency in which you have experts being transparent to nonexperts, you’re going to get a significant amount of information loss,” he said. What’s meaningful in a philosophy department can be largely incomprehensible to non-philosophers, so the information must be recast in simplified terms. Furthermore, simplified metrics frequently distort incentives. If graduation rates are the metric by which funding is determined, then a school might do whatever it takes to bolster them. Although some of these efforts might add value to students’ learning, it’s also possible to game the system in ways that are counterproductive to actual education.

Transparency is often portrayed as objective, but, like a camera, it is subject to manipulation even as it appears to be relaying reality. Ida Koivisto, a legal scholar at the University of Helsinki, has studied the trade-offs that flow from who holds that camera. She finds that when an authority—a government agency, a business, a public figure—elects to be transparent, people respond positively, concluding that the willingness to be open reflects integrity, and thus confers legitimacy. Since the authority has initiated this transparency, however, it naturally chooses to be transparent in areas where it looks good. Voluntary transparency sacrifices a degree of truth. On the other hand, when transparency is initiated by outside forces—mandates, audits, investigations—both the good and the bad are revealed. Such involuntary transparency is more truthful, but it often makes its subject appear flawed and dishonest, and so less legitimate. There’s a trade-off, Koivisto concludes, between “legitimacy” and “the ‘naked truth.’”…(More)”.

End of data sharing could make Covid-19 harder to control, experts and high-risk patients warn

Article by Sam Whitehead: “…The federal government’s public health emergency that’s been in effect since January 2020 expires May 11. The emergency declaration allowed for sweeping changes in the U.S. health care system, like requiring state and local health departments, hospitals, and commercial labs to regularly share data with federal officials.

But some data-sharing requirements will come to an end, and the federal government will lose access to key metrics, as a skeptical Congress seems unlikely to grant agencies additional powers. And private projects, like those from The New York Times and Johns Hopkins University, which made covid data understandable and useful for everyday people, stopped collecting data in March.

Public health legal scholars, data experts, former and current federal officials, and patients at high risk of severe covid outcomes worry the scaling back of data access could make it harder to control covid.

There have been improvements in recent years, such as major investments in public health infrastructure and updated data reporting requirements in some states. But concerns remain that the overall shambolic state of U.S. public health data infrastructure could hobble the response to any future threats.

“We’re all less safe when there’s not the national amassing of this information in a timely and coherent way,” said Anne Schuchat, former principal deputy director of the Centers for Disease Control and Prevention.

A lack of data in the early days of the pandemic left federal officials, like Schuchat, with an unclear picture of the rapidly spreading coronavirus. And even as the public health emergency opened the door for data-sharing, the CDC labored for months to expand its authority.

Eventually, more than a year into the pandemic, the CDC gained access to data from private health care settings, such as hospitals and nursing homes, commercial labs, and state and local health departments…(More)”. See also: Why we still need data to understand the COVID-19 pandemic

Responding to the coronavirus disease-2019 pandemic with innovative data use: The role of data challenges

Paper by Jamie Danemayer, Andrew Young, Siobhan Green, Lydia Ezenwa and Michael Klein: “Innovative, responsible data use is a critical need in the global response to the coronavirus disease-2019 (COVID-19) pandemic. Yet potentially impactful data are often unavailable to those who could utilize it, particularly in data-poor settings, posing a serious barrier to effective pandemic mitigation. Data challenges, a public call-to-action for innovative data use projects, can identify and address these specific barriers. To understand gaps and progress relevant to effective data use in this context, this study thematically analyses three sets of qualitative data focused on/based in low/middle-income countries: (a) a survey of innovators responding to a data challenge, (b) a survey of organizers of data challenges, and (c) a focus group discussion with professionals using COVID-19 data for evidence-based decision-making. Data quality and accessibility and human resources/institutional capacity were frequently reported limitations to effective data use among innovators. New fit-for-purpose tools and the expansion of partnerships were the most frequently noted areas of progress. Discussion participants identified that building capacity for external/national actors to understand the needs of local communities can address a lack of partnerships while de-siloing information. A synthesis of themes demonstrated that gaps, progress, and needs commonly identified by these groups are relevant beyond COVID-19, highlighting the importance of a healthy data ecosystem to address emerging threats. This is supported by data holders prioritizing the availability and accessibility of their data without causing harm; funders and policymakers committed to integrating innovations with existing physical, data, and policy infrastructure; and innovators designing sustainable, multi-use solutions based on principles of good data governance…(More)”.

The limits of expert judgment: Lessons from social science forecasting during the pandemic

Article by Cendri Hutcherson and Michael Varnum: “Imagine being a policymaker at the beginning of the COVID-19 pandemic. You have to decide which actions to recommend, how much risk to tolerate and what sacrifices to ask your citizens to bear.

Who would you turn to for an accurate prediction about how people would react? Many would recommend going to the experts — social scientists. But we are here to tell you this would be bad advice.

As psychological scientists with decades of combined experience studying decision-making, wisdom, expert judgment and societal change, we hoped social scientists’ predictions would be accurate and useful. But we also had our doubts.

Our discipline has been undergoing a crisis due to failed study replications and questionable research practices. If basic findings can’t be reproduced in controlled experiments, how confident can we be that our theories can explain complex real-world outcomes?

To find out how well social scientists could predict societal change, we ran the largest forecasting initiative in our field’s history using predictions about change in the first year of the COVID-19 pandemic as a test case….

Our findings, detailed in peer-reviewed papers in Nature Human Behaviour and in American Psychologist, paint a sobering picture. Despite the causal nature of most theories in the social sciences, and the fields’ emphasis on prediction in controlled settings, social scientists’ forecasts were generally not very good.

In both papers, we found that experts’ predictions were generally no more accurate than those made by samples of the general public. Further, their predictions were often worse than predictions generated by simple statistical models.

Our studies did still give us reasons to be optimistic. First, forecasts were more accurate when teams had specific expertise in the domain they were making predictions in. If someone was an expert in depression, for example, they were better at predicting societal trends in depression.

Second, when teams were made up of scientists from different fields working together, they tended to do better at forecasting. Finally, teams that used simpler models to generate their predictions and made use of past data generally outperformed those that didn’t.

These findings suggest that, despite the poor performance of the social scientists in our studies, there are steps scientists can take to improve their accuracy at this type of forecasting…(More)”.

The pandemic veneer: COVID-19 research as a mobilisation of collective intelligence by the global research community

Paper by Daniel W Hook and James R Wilsdon: “The global research community responded with speed and at scale to the emergence of COVID-19, with around 4.6% of all research outputs in 2020 related to the pandemic. That share almost doubled through 2021, to reach 8.6% of research outputs. This reflects a dramatic mobilisation of global collective intelligence in the face of a crisis. It also raises fundamental questions about the funding, organisation and operation of research. In this Perspective article, we present data that suggests that COVID-19 research reflects the characteristics of the underlying networks from which it emerged, and on which it built. The infrastructures on which COVID-19 research has relied – including highly skilled, flexible research capacity and collaborative networks – predated the pandemic, and are the product of sustained, long-term investment. As such, we argue that COVID-19 research should not be viewed as a distinct field, or one-off response to a specific crisis, but as a ‘pandemic veneer’ layered on top of longstanding interdisciplinary networks, capabilities and structures. These infrastructures of collective intelligence need to be better understood, valued and sustained as crucial elements of future pandemic or crisis response…(More)”.

To harness telecom data for good, there are six challenges to overcome

Blog by Anat Lewin and Sveta Milusheva: “The global use of mobile phones generates a vast amount of data. What good can be done with these data? During the COVID-19 pandemic, we saw that aggregated data from mobile phones can tell us where groups of humans are going, how many of them are there, and how they are behaving as a cluster. When used effectively and responsibly, mobile phone data can be immensely helpful for development work and emergency response — particularly in resource-constrained countries.  For example, an African country that had, in recent years, experienced a cholera outbreak was ahead of the game. Since the legal and practical agreements were already in place to safely share aggregated mobile data, accessing newer information to support epidemiological modeling for COVID-19 was a straightforward exercise. The resulting datasets were used to produce insightful analyses that could better inform health, lockdown, and preventive policy measures in the country.

To better understand such challenges and opportunities, we led an effort to access and use anonymized, aggregated mobile phone data across 41 countries. During this process, we identified several recurring roadblocks and replicable successes, which we summarized in a paper along with our lessons learned. …(More)”.

Data Collaborative Case Study: NYC Recovery Data Partnership

Report by the Open Data Policy Lab (The GovLab): “In July 2020, following severe economic and social losses due to the COVID-19 pandemic, the administration of New York City Mayor Bill de Blasio announced the NYC Recovery Data Partnership. This data collaborative asked private and civic organizations with assets relevant to New York City to provide their data to the city. Senior city leaders from the First Deputy Mayor’s Office, the Mayor’s Office of Operations, Mayor’s Office of Information Privacy and Mayor’s Office of Data Analytics formed an internal coalition which served as trusted intermediaries, assessing requests from city agencies to use the data provided and allocating access accordingly. The data informed internal research conducted by various city agencies, including New York City Emergency Management’s Recovery Team and the NYC Department of City Planning. The experience reveals the ability of crises to spur innovation, the value of responsiveness from both data users and data suppliers, the importance of technical capacity, and the value of a network of peers. In terms of challenges, the experience also exposes the limitations of data, the challenges of compiling complex datasets, and the role of resource constraints…(More)”.