The Doctor Who Wasn’t There: Technology, History, and the Limits of Telehealth


Book by Jeremy A. Greene: “The Doctor Who Wasn’t There traces the long arc of enthusiasm for—and skepticism of—electronic media in health and medicine. Over the past century, a series of new technologies promised to democratize access to healthcare. From the humble telephone to the connected smartphone, from FM radio to wireless wearables, from cable television to the “electronic brains” of networked mainframe computers: each new platform has promised a radical reformation of the healthcare landscape. With equal attention to the history of technology, the history of medicine, and the politics and economies of American healthcare, physician and historian Jeremy A. Greene explores the role that electronic media play, for better and for worse, in the past, present, and future of our health.

Today’s telehealth devices are far more sophisticated than the hook-and-ringer telephones of the 1920s, the radios that broadcast health data in the 1940s, the closed-circuit televisions that enabled telemedicine in the 1950s, or the online systems that created electronic medical records in the 1960s. But the ethical, economic, and logistical concerns they raise are prefigured in the past, as are the gaps between what was promised and what was delivered. Each of these platforms also produced subtle transformations in health and healthcare that we have learned to forget, displaced by promises of ever newer forms of communication that took their place.

Illuminating the social and technical contexts in which electronic medicine has been conceived and put into practice, Greene’s history shows the urgent stakes, then and now, for those who would seek in new media the means to build a more equitable future for American healthcare….(More)”.

Who owns the map? Data sovereignty and government spatial data collection, use, and dissemination


Paper by Peter A. Johnson and Teresa Scassa: “Maps, created through the collection, assembly, and analysis of spatial data, are used to support government planning and decision-making. Traditionally, spatial data used to create maps are collected, controlled, and disseminated by government, although over time, this role has shifted. This shift has been driven by the availability of alternate sources of data collected by private sector companies, and data contributed by volunteers to open mapping platforms, such as OpenStreetMap. In theorizing this shift, we provide examples of how governments use data sovereignty as a tool to shape spatial data collection, use, and sharing. We frame four models of how governments may navigate shifting spatial data sovereignty regimes: first, with government retaining complete control over data collection; second, with government contracting a third party to provide specific data collection services, but with data ownership and dissemination responsibilities resting with government; third, with government purchasing data under terms of access set by third party data collectors, who disseminate data to several parties; and finally, with government retreating from or relinquishing data sovereignty altogether. Within this rapidly changing landscape of data providers, we propose that governments must consider how to address data sovereignty concerns to retain their ability to control data use in the public interest…(More)”.

How Smart Are the Robots Getting?


Cade Metz at The New York Times: “…These are not systems that anyone can properly evaluate with the Turing test — or any other simple method. Their end goal is not conversation.

Researchers at Google and DeepMind, which is owned by Google’s parent company, are developing tests meant to evaluate chatbots and systems like DALL-E, to judge what they do well, where they lack reason and common sense, and more. One test shows videos to artificial intelligence systems and asks them to explain what has happened. After watching someone tinker with an electric shaver, for instance, the A.I. must explain why the shaver did not turn on.

These tests feel like academic exercises — much like the Turing test. We need something that is more practical, that can really tell us what these systems do well and what they cannot, how they will replace human labor in the near term and how they will not.

We could also use a change in attitude. “We need a paradigm shift — where we no longer judge intelligence by comparing machines to human behavior,” said Oren Etzioni, professor emeritus at the University of Washington and founding chief executive of the Allen Institute for AI, a prominent lab in Seattle….

At the same time, there are many ways these bots are superior to you and me. They do not get tired. They do not let emotion cloud what they are trying to do. They can instantly draw on far larger amounts of information. And they can generate text, images and other media at speeds and volumes we humans never could.

Their skills will also improve considerably in the coming years.

Researchers can rapidly hone these systems by feeding them more and more data. The most advanced systems, like ChatGPT, require months of training, but over those months, they can develop skills they did not exhibit in the past.

“We have found a set of techniques that scale effortlessly,” said Raia Hadsell, senior director of research and robotics at DeepMind. “We have a simple, powerful approach that continues to get better and better.”

The exponential improvement we have seen in these chatbots over the past few years will not last forever. The gains may soon level out. But even then, multimodal systems will continue to improve — and master increasingly complex skills involving images, sounds and computer code. And computer scientists will combine these bots with systems that can do things they cannot. ChatGPT failed Turing’s chess test. But we knew in 1997 that a computer could beat the best humans at chess. Plug ChatGPT into a chess program, and the hole is filled.

In the months and years to come, these bots will help you find information on the internet. They will explain concepts in ways you can understand. If you like, they will even write your tweets, blog posts and term papers.

They will tabulate your monthly expenses in your spreadsheets. They will visit real estate websites and find houses in your price range. They will produce online avatars that look and sound like humans. They will make mini-movies, complete with music and dialogue…

Certainly, these bots will change the world. But the onus is on you to be wary of what these systems say and do, to edit what they give you, to approach everything you see online with skepticism. Researchers know how to give these systems a wide range of skills, but they do not yet know how to give them reason or common sense or a sense of truth.

That still lies with you…(More)”.

The Autocrat in Your iPhone


Article by Ronald J. Deibert: “In the summer of 2020, a Rwandan plot to capture exiled opposition leader Paul Rusesabagina drew international headlines. Rusesabagina is best known as the human rights defender and U.S. Presidential Medal of Freedom recipient who sheltered more than 1,200 Hutus and Tutsis in a hotel during the 1994 Rwandan genocide. But in the decades after the genocide, he also became a prominent U.S.-based critic of Rwandan President Paul Kagame. In August 2020, during a layover in Dubai, Rusesabagina was lured under false pretenses into boarding a plane bound for Kigali, the Rwandan capital, where government authorities immediately arrested him for his affiliation with an opposition group. The following year, a Rwandan court sentenced him to 25 years in prison, drawing the condemnation of international human rights groups, the European Parliament, and the U.S. Congress. 

Less noted at the time, however, was that this brazen cross-border operation may also have employed highly sophisticated digital surveillance. After Rusesabagina’s sentencing, Amnesty International and the Citizen Lab at the University of Toronto, a digital security research group I founded and direct, discovered that smartphones belonging to several of Rusesabagina’s family members who also lived abroad had been hacked by an advanced spyware program called Pegasus. Produced by the Israel-based NSO Group, Pegasus gives an operator near-total access to a target’s personal data. Forensic analysis revealed that the phone belonging to Rusesabagina’s daughter Carine Kanimba had been infected by the spyware around the time her father was kidnapped and again when she was trying to secure his release and was meeting with high-level officials in Europe and the U.S. State Department, including the U.S. special envoy for hostage affairs. NSO Group does not publicly identify its government clients and the Rwandan government has denied using Pegasus, but strong circumstantial evidence points to the Kagame regime.

In fact, the incident is only one of dozens of cases in which Pegasus or other similar spyware technology has been found on the digital devices of prominent political opposition figures, journalists, and human rights activists in many countries. Providing the ability to clandestinely infiltrate even the most up-to-date smartphones—the latest “zero click” version of the spyware can penetrate a device without any action by the user—Pegasus has become the digital surveillance tool of choice for repressive regimes around the world. It has been used against government critics in the United Arab Emirates (UAE) and pro-democracy protesters in Thailand. It has been deployed by Mohammed bin Salman’s Saudi Arabia and Viktor Orban’s Hungary…(More)”.

Responding to societal challenges with data: Access, sharing, stewardship and control


OECD Report: “Data access, sharing and re-use (“data openness”) can generate significant social and economic benefits, including addressing public health emergencies such as the COVID-19 pandemic and achieving the Sustainable Development Goals. However, data openness also comes with risks to individuals and organisations – notably risks to privacy and data protection, intellectual property rights, and digital and national security. It also raises ethical concerns where data access, sharing and re-use undermine ethical values and norms. This report demonstrates how approaches to data stewardship and control that are more balanced and differentiated can maximise the benefits of data, while protecting individuals’ and organisations’ rights and taking into account other legitimate interests and public policy objectives. It presents the mix of technical, organisational and legal approaches that characterises these more balanced and differentiated approaches, and how governments have implemented them…(More)”

2023 Edelman Trust Barometer


Press Release: “The 2023 Edelman Trust Barometer reveals that business is now viewed as the only global institution to be both competent and ethical. Business now holds a staggering 53-point lead over government in competence and is 30 points ahead on ethics. Its treatment of workers during the pandemic and return to work, along with the swift and decisive action of over 1,000 businesses to exit Russia after its invasion of Ukraine, helped fuel a 20-point jump on ethics over the past three years. Business (62 percent) remains the most trusted, and the only trusted, institution globally. …

Other key findings from the 2023 Edelman Trust Barometer include:

  • Personal economic fears such as job loss (89 percent) and inflation (74 percent) are on par with urgent societal fears like climate change (76 percent), nuclear war (72 percent) and food shortages (67 percent).
  • CEOs are expected to use resources to hold divisive forces accountable: 72 percent believe CEOs are obligated to defend facts and expose questionable science being used to justify bad social policy; 71 percent believe CEOs are obligated to pull advertising money out of media platforms that spread misinformation; and 64 percent, on average, say companies can help increase civility and strengthen the social fabric by supporting politicians and media outlets that build consensus and cooperation.
  • Government (51 percent) is now distrusted in 16 of the 28 countries surveyed including the U.S. (42 percent), the UK (37 percent), Japan (33 percent), and Argentina (20 percent). Media (50 percent) is distrusted in 15 of 28 countries including Germany (47 percent), the U.S. (43 percent), Australia (38 percent), and South Korea (27 percent). ‘My employer’ (77 percent) is the most trusted institution and is trusted in every country surveyed aside from South Korea (54 percent).
  • Government leaders (41 percent), journalists (47 percent) and CEOs (48 percent) are the least trusted institutional leaders. Scientists (76 percent), my coworkers (73 percent among employees) and my CEO (64 percent among employees) are most trusted.
  • Technology (75 percent) was once again the most trusted sector, trailed by education (71 percent), food & beverage (71 percent) and healthcare (70 percent). Social media (44 percent) remained the least trusted sector.
  • Canada (67 percent) and Germany (63 percent) remained the two most trusted foreign brands, followed by Japan (61 percent) and the UK (59 percent). India (34 percent) and China (32 percent) remain the least trusted…(More)”.

Data Free Flow with Trust: Overcoming Barriers to Cross-Border Data Flows


Briefing Paper by the WEF: “The movement of data across country borders is essential to the global economy. When data flows across borders, it is possible to deliver more to more people and produce more benefits for people and planet. This briefing paper highlights the importance of such data flows and urges global leaders in the public and private sectors to take collective action to work towards a shared understanding of them with a view to implementing “Data Free Flow with Trust” (DFFT) – an umbrella concept for facilitating trust-based data exchanges. This paper reviews the current challenges facing DFFT, takes stock of progress made so far, offers direction for policy mechanisms and concrete tools for businesses and, more importantly, promotes global discussions about how to realize DFFT from the perspectives of policy and business…(More)”.

Five Conjectures to Explore in 2023 as They Relate to Data for Good


Essay by Hannah Chafetz, Uma Kalkar, Marine Ragnet, Stefaan Verhulst: “From the regulations proposed in the European Artificial Intelligence (AI) Act to the launch of OpenAI’s ChatGPT tool, 2022 was a year that saw many policy and technological developments. Taking stock of recent data and technology trends, we offer some conjectures as to how these ideas may play out over the next year. Indeed, predictions can be dangerous, which is why we position the below as conjectures — propositions that remain tentative until more evidence emerges — that can help advance the agenda and direction of our focus areas on the responsible use of data for the public good.

Below, we provide a summary of the five conjectures that The GovLab will track and revisit throughout 2023.

Conjecture 1. In 2023 … non-traditional data may be used with increasing frequency to solve public problems.

Complex crises, from COVID-19 to climate change, demonstrate a need for information about a variety of developments quickly and at scale. Traditional sources are not enough: growing awareness and (re)use of non-traditional data sources (NTD) to fill the gaps in traditional data cast a spotlight on the value of using and combining new data sources for problem-solving. Over the next year, NTD sources could increasingly be called upon by decision-makers to address large-scale public problems.

NTD refers to data that is “digitally captured (for example, mobile phone records and financial data), mediated (for example, social media and online data), or observed (for example, satellite imagery),” using new instrumentation mechanisms and is often privately held. Our recent report discussed how COVID-19 was a “watershed moment” in terms of generating access to non-traditional health, mobility, economic, and sentiment data. As detailed in the report, decision-makers around the world increasingly recognize the potential of NTD sources when combined with traditional data responsibly. Similarly, developments in the war in Ukraine presented a pivotal moment regarding the use of NTD sources. For instance, satellite images, social media narrative trends, and real-time location mapping have supported humanitarian action and peacebuilding.

These are just two examples of the increasing interest in NTD to solve public problems. We predict that this trend could continue to expand as technological advances make non-traditional data more widely available to decision-makers. Already, the financial sector is increasingly incorporating non-traditional data to inform decisions such as assessing lending risks. Recently, the fintech business Nova Credit and HSBC partnered to exploit cross-border data to allow immigrants access to credit by predicting creditworthiness via digital footprint and psychometric data. This trend is compounded by increased legislation aiming to open up the re-use of private sector data, particularly in Europe. The increased attention to NTD sources signals a need to prioritize the alignment of the supply and demand of NTD and develop a systematized approach to how it can be integrated within decision-making cycles…(More)”.

Recentring the demos in the measurement of democracy


Article by Seema Shah: “Rethinking how we measure and evaluate democratic performance is vital to reversing a longstanding negative trend in global democracy. We must confront the past, including democracy’s counter-intuitively intrinsic inequality. This is key to revitalising institutions in a way that allows democratic practice to live up to its potential…

In the global democracy assessment space, teams like the one I lead at International IDEA compete to provide the most rigorous, far-reaching and understandable set of democracy measurements in the world. Alexander Hudson explains how critical these indicators are, providing important benchmarks for democratic growth and decline to policymakers, governments, international organisations, and journalists.

Yet in so many ways, the core of what these datasets measure and help assess is largely the same. This redundancy is no doubt at least partially a product of wealthy donors’ prioritisation of liberal democracy as an ideal. It is compounded by how the measures are calculated. As Adam Przeworski recently stated, reliance on expert coders runs the risk of measuring little other than those experts’ biases.

But if that is the case, and quantitative measurements continue to be necessary for democracy assessment, shouldn’t we rethink exactly what we are measuring and how we are measuring it?…

Democracy assessment indices do not typically measure ordinary people’s evaluations of the state of democracy. Instead, other specialised ‘barometers’ often take on this task. See, for example, Afrobarometer, Eurobarometer, Asian Barometer, and Latinobarometro. Surveys of public perceptions on a range of issues also exist, including, but not limited to, democracy. The problem is, however, that these do not systematically make it into overall democracy assessments or onto policymakers’ desks. This means that policymakers and others do not consistently prioritise or consider lived experiences as they make decisions about democracy and human rights-related funding and interventions…(More)”.

Accelerate Aspirations: Moving Together to Achieve Systems Change


Report by Data.org: “To solve our greatest global challenges, we need to accelerate how we use data for good. But to truly make data-driven tools that serve society, we must re-imagine data for social impact more broadly, more inclusively, and in a more interdisciplinary way. 

So, we face a choice. Business as usual can continue through funding and implementing under-resourced and siloed data projects that deliver incremental progress. Or we can think and act boldly to drive equitable and sustainable solutions. 

Accelerate Aspirations: Moving Together to Achieve Systems Change is a comprehensive report on the key trends and tensions in the emerging field of data for social impact…(More)”.