No more gut-based strategies: Using evidence to solve the digital divide


Gregory Rosston and Scott J. Wallsten at The Hill: “COVID-19 has, among other things, brought home the costs of the digital divide. Numerous op-eds have offered solutions, including increasing subsidies to schools, providing eligible low-income people with a $50 per month broadband credit, funding more digital literacy classes and putting WiFi on school buses. A House bill would allocate $80 billion to ideas meant to close the digital divide.

The key missing component of nearly every proposal to solve the connectivity problem is evidence — evidence suggesting the ideas are likely to work and ways to use evidence in the future to evaluate whether they did work. Otherwise, we are likely throwing money away. Understanding what works and what doesn’t requires data collection and research now and in the future….

Consider President Trump’s belief in hydroxychloroquine as a cure for the novel coronavirus based simply on his “gut.” That resulted in the government ordering the drug to be produced and distributed to hospitals, and putting 63 million doses into a strategic national stockpile.

The well-meaning folks offering up multi-billion dollar broadband plans probably recognize the foolhardiness of the president’s gut-check approach to guiding virus treatment plans. But so far, policy makers and advocates are promoting their own gut beliefs that their proposals will treat the digital divide. An evidence-free approach is likely to cost billions of dollars more and connect fewer people than an evidence-based approach.

It doesn’t have to be this way. The pandemic not only laid bare the implications of the digital divide, it also created a laboratory for studying how best to bridge the divide. The most immediate problem was how to help kids without home broadband attend distance learning classes. Schools had no time to formally study different options — it was a race to find anything that might help. As a result, schools inadvertently ran thousands of concurrent experiments around the country….(More)”.

COVID Data Failures Create Pressure for Public Health System Overhaul


Kaiser Health News: “After terrorists slammed a plane into the Pentagon on 9/11, ambulances rushed scores of the injured to community hospitals, but only three of the patients were taken to specialized trauma wards. The reason: The hospitals and ambulances had no real-time information-sharing system.

Nineteen years later, there is still no national data network that enables the health system to respond effectively to disasters and disease outbreaks. Many doctors and nurses must fill out paper forms on COVID-19 cases and available beds and fax them to public health agencies, causing critical delays in care and hampering the effort to track and block the spread of the coronavirus.

There are signs the COVID-19 pandemic has created momentum to modernize the nation’s creaky, fragmented public health data system, in which nearly 3,000 local, state and federal health departments set their own reporting rules and vary greatly in their ability to send and receive data electronically.

Sutter Health and UC Davis Health, along with nearly 30 other provider organizations around the country, recently launched a collaborative effort to speed and improve the sharing of clinical data on individual COVID cases with public health departments.

But even that platform, which contains information about patients’ diagnoses and response to treatments, doesn’t yet include data on the availability of hospital beds, intensive care units or supplies needed for a seamless pandemic response.

The federal government spent nearly $40 billion over the past decade to equip hospitals and physicians’ offices with electronic health record systems for improving treatment of individual patients. But no comparable effort has emerged to build an effective system for quickly moving information on infectious disease from providers to public health agencies.

In March, Congress approved $500 million over 10 years to modernize the public health data infrastructure. But the amount falls far short of what’s needed to update data systems and train staff at local and state health departments, said Brian Dixon, director of public health informatics at the Regenstrief Institute in Indianapolis….(More)”.

Terms of Disservice: How Silicon Valley is Destructive by Design


Book by Dipayan Ghosh: “Designing a new digital social contract for our technological future…High technology presents a paradox. In just a few decades, it has transformed the world, making almost limitless quantities of information instantly available to billions of people and reshaping businesses, institutions, and even entire economies. But it also has come to rule our lives, addicting many of us to the march of megapixels across electronic screens both large and small.

Despite its undeniable value, technology is exacerbating deep social and political divisions in many societies. Elections influenced by fake news and unscrupulous hidden actors, the cyber-hacking of trusted national institutions, the vacuuming of private information by Silicon Valley behemoths, ongoing threats to vital infrastructure from terrorist groups and even foreign governments—all these concerns are now part of the daily news cycle and are certain to become increasingly serious into the future.

In this new world of endless technology, how can individuals, institutions, and governments harness its positive contributions while protecting each of us, no matter who or where we are?

In this book, a former Facebook public policy adviser who went on to assist President Obama in the White House offers practical ideas for using technology to create an open and accessible world that protects all consumers and civilians. As a computer scientist turned policymaker, Dipayan Ghosh answers the biggest questions about technology facing the world today. Providing clear and understandable explanations for complex issues, Terms of Disservice will guide industry leaders, policymakers, and the general public as we think about how we ensure that the Internet works for everyone, not just Silicon Valley….(More)”.

Coronavirus Compels Congress to Modernize Communication Techniques


Congressional Management Foundation: “The Future of Citizen Engagement: Coronavirus, Congress, and Constituent Communications” explores how Members of Congress and their staff engaged with citizens while navigating the constraints posed by COVID-19, and offers examples of how Congress can substantively connect with constituents using modern technology against the backdrop of a global pandemic.

The report addresses the following questions:

  • How did congressional offices adapt their communications strategies to meet the immediate needs of their constituents during the onset of COVID-19?
  • What techniques did Members use to diversify their constituent outreach?
  • What methods of engagement is Congress using now, and likely to use in the future?

The findings are based on a survey of senior congressional staffers, comprising over 120 responses provided to CMF between May 26 and June 19, 2020. Additionally, CMF conducted 13 follow-up interviews with survey respondents who indicated they were willing to speak further about their office operations and constituent communications during COVID-19….(More)”.

How Philanthropy Can Help Governments Accelerate a Real Recovery


Essay by Michele Jolin and David Medina: “The cracks and design flaws of our nation’s public systems have been starkly exposed as governments everywhere struggle to respond to health and economic crises that disproportionately devastate Black residents and communities of color. As government leaders respond to the immediate emergencies, they also operate within a legacy of government practices, policies and systems that have played a central role in creating and maintaining racial inequity. 

Philanthropy can play a unique and catalytic role in accelerating a real recovery by helping government leaders make smarter decisions, helping them develop and effectively use the data-and-evidence capacity they need to spotlight and understand root causes of community challenges, especially racial disparities, and increase the impact of government investments that could close racial gaps and accelerate economic opportunity. Philanthropy can uniquely support leaders within government who are best positioned to redesign and reimagine public systems to deliver equity and impact.

We are already seeing that the growing number of governments that have built data-driven “Moneyball” muscles are better positioned both to manage through this crisis and to dismantle racist government practices. While we recognize that data and evidence can sometimes reinforce biases, we also know that government decision-makers who have access to more and better information—and who are trained to navigate the nuance and possible bias in this information—can use data to identify disparate racial outcomes, understand the core problems and target resources to close gaps. Government decision-makers who have the skills to test, learn, and improve government programs can prioritize resource allocation toward programs that both deliver better results and address the complexity of social problems.

Philanthropy can accelerate this public sector transformation by supporting change led by internal government champions who are challenging the status quo. By doing so, philanthropic leaders can increase the impact of the trillions of dollars invested by governments each year. Philanthropies such as Ballmer Group, Bloomberg Philanthropies, Blue Meridian Partners, the Bill & Melinda Gates Foundation, and Arnold Ventures understand this and are already putting their money where their mouths are. By helping governments make smarter budget and policy decisions, they can ensure that public dollars flow toward solutions that make a meaningful, measurable difference on our biggest challenges, whether it’s increasing economic mobility, reducing racial disparities in health and other outcomes, or addressing racial bias in government systems.

We need other donors to join them in prioritizing this kind of systems change….(More)”.

Why Personal Data Is a National Security Issue


Article by Susan Ariel Aaronson: “…Concerns about the national security threat from personal data held by foreigners first emerged in 2013. Several U.S. entities, including Target, J.P. Morgan, and the U.S. Office of Personnel Management, were hacked. Many attributed the hacking to Chinese entities. Administration officials concluded that the Chinese government could cross-reference legally obtained and hacked data sets to reveal information about U.S. objectives and strategy.

Personal data troves can also be cross-referenced to identify individuals, putting both personal security as well as national security at risk. Even U.S. firms pose a direct and indirect security threat to individuals and the nation because of their failure to adequately protect personal data. For example, Facebook has a disturbing history of sharing personal data without consent and allowing its clients to use that data to manipulate users. Some app designers have enabled functionality unnecessary for their software’s operation, while others, like Anomaly 6, embedded their software in mobile apps without the permission of users or firms. Other companies use personal data without user permission to create new products. Clearview AI scraped billions of images from major web services such as Facebook, Google, and YouTube, and sold these images to law enforcement agencies around the world. 

Firms can also inadvertently aggregate personal data and in so doing threaten national security. Strava, an athletes’ social network, released a heat map of its global users’ activities in 2018. Savvy analysts were able to use the heat map to reveal secret military bases and patrol routes. Chinese-owned data firms could be a threat to national security if they share data with the Chinese government. But the problem lies in the U.S.’s failure to adequately protect personal data and police the misuse of data collected without the permission of users….(More)”.

Journalists’ guide to COVID data


Guide by RTDNA: “Watch a press conference, turn on a newscast, or overhear just about any phone conversation these days and you’ll hear mayors discussing R values, reporters announcing new fatalities and separated families comparing COVID case rolling averages in their counties. As coronavirus resurges across the country, medical data is no longer just the purview of epidemiologists (though a quick glance at any social media comments section shows an unlikely simultaneous surge in the number of virology experts and statisticians).

Journalists reporting on COVID, however, have a particular obligation to understand the data, to add context and to acknowledge uncertainty when reporting the numbers.

“Journalism requires more than merely reporting remarks, claims or comments. Journalism verifies, provides relevant context, tells the rest of the story and acknowledges the absence of important additional information.” – RTDNA Code of Ethics

This guide to common COVID metrics is designed to help journalists know how each data point is calculated, what it means and, importantly, what it doesn’t mean….(More)”.
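
As a small illustration of the kind of calculation the guide covers (a minimal sketch; the numbers and the growing-window choice below are illustrative assumptions, not taken from the RTDNA guide), here is how a seven-day rolling average of newly reported cases can be computed in Python. The rolling average smooths day-of-week reporting artifacts, such as low weekend counts, so underlying trends are easier to read:

```python
def rolling_average(daily_counts, window=7):
    """Rolling average of newly reported daily case counts.

    Uses a growing window for the first (window - 1) days so every day
    gets a value; many dashboards instead report nothing until a full
    window of data exists.
    """
    averages = []
    for i in range(len(daily_counts)):
        start = max(0, i - window + 1)
        chunk = daily_counts[start:i + 1]
        averages.append(sum(chunk) / len(chunk))
    return averages


# Hypothetical ten days of newly reported cases
new_cases = [120, 135, 90, 40, 35, 150, 160, 170, 155, 95]
print([round(x, 1) for x in rolling_average(new_cases)])
```

Which window to use, and whether to report anything before a full window of data exists, are exactly the kinds of methodological choices the guide urges journalists to understand and acknowledge when reporting the numbers.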

Genomic Epidemiology Data Infrastructure Needs for SARS-CoV-2


Report by the National Academies of Sciences, Engineering, and Medicine: “In December 2019, new cases of severe pneumonia were first detected in Wuhan, China, and the cause was determined to be a novel beta coronavirus related to the severe acute respiratory syndrome (SARS) coronavirus that emerged from a bat reservoir in 2002. Within six months, this new virus—SARS coronavirus 2 (SARS-CoV-2)—has spread worldwide, infecting at least 10 million people with an estimated 500,000 deaths. COVID-19, the disease caused by SARS-CoV-2, was declared a public health emergency of international concern on January 30, 2020 by the World Health Organization (WHO) and a pandemic on March 11, 2020. To date, there is no approved effective treatment or vaccine for COVID-19, and it continues to spread in many countries.

Genomic Epidemiology Data Infrastructure Needs for SARS-CoV-2: Modernizing Pandemic Response Strategies lays out a framework to define and describe the data needs for a system to track and correlate viral genome sequences with clinical and epidemiological data. Such a system would help ensure the integration of data on viral evolution with detection, diagnostic, and countermeasure efforts. This report also explores data collection mechanisms to ensure a representative global sample set of all relevant extant sequences and considers challenges and opportunities for coordination across existing domestic, global, and regional data sources….(More)”.

How the Administrative State Got to This Challenging Place


Essay by Peter Strauss: “This essay has been written to set the context for a future issue of Daedalus, the quarterly of the American Academy of Arts and Sciences, addressing the prospects of American administrative law in the Twenty-first Century. It recounts the growth of American government over the centuries since its founding, in response to the profound changes in the technology, economy, and scientific understandings it must deal with, under a Constitution written for the governance of a dispersed agrarian population operating with hand tools in a localized economy. It then suggests profound challenges of the present day facing administrative law’s development: the transition from processes of the paper age to those of the digital age; the steadily growing centralization of decision in an opaque, political presidency, displacing the focused knowledge and expertise of agencies Congress created to pursue particular governmental ends; the thickening, as well, of the political layer within agencies themselves, threatening similar displacements; and the revival in the courts of highly formalized analytic techniques inviting a return to the forms of government those who wrote the Constitution might themselves have imagined. The essay will not be published until months after the November election. While President Trump’s first term in office has sharply illustrated an imbalance in American governance between law and politics, reason and unreason, that imbalance is hardly new; it has been growing for decades. There lie the challenges….(More)”

What privacy preserving techniques make possible: for transport authorities


Blog by Georgina Bourke: “The Mayor of London listed cycling and walking as key population health indicators in the London Health Inequalities Strategy. The pandemic has only amplified the need for people to use cycling as a safer and healthier mode of transport. Yet as the majority of cyclists are white, Black communities are less likely to get the health benefits that cycling provides. Groups like Transport for London (TfL) should monitor how different communities cycle and who is excluded. Organisations like the London Office of Technology and Innovation (LOTI) could help boroughs procure privacy preserving technology to help their efforts.

But at the moment, it’s difficult for public organisations to access mobility data held by private companies. One reason is that mobility data is sensitive. Even if you remove identifiers like name and address, there’s still a risk you can reidentify someone by linking different data sets together. This means you could track how an individual moved around a city. I wrote more about the privacy risks with mobility data in a previous blog post. The industry’s awareness of privacy issues in using and sharing mobility data is rising. In the case of the Los Angeles Department of Transportation’s (LADOT) Mobility Data Specification, Uber is concerned about sharing anonymised data because of the privacy risk. Both organisations are now involved in a legal battle to see which has the rights to the data. This might have been avoided if Uber had applied privacy-preserving techniques….

Privacy-preserving techniques can help mobility providers share important insights with authorities without compromising people’s privacy.

Instead of requiring access to all customer trip data, authorities could ask specific questions like “Where are the least popular places to cycle?” If mobility providers apply techniques like randomised response, an individual’s identity is obscured by the noise added to the data. This means it’s highly unlikely that someone could be reidentified later on. And because this technique requires authorities to ask very specific questions – for randomised response to work, the answer has to be binary, i.e. Yes or No – authorities will also be practising data minimisation by default.
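
To make the mechanism concrete, here is a minimal Python sketch of classic randomised response for a single yes/no question (the example question about “area X”, the 10% base rate, and the 50% truth probability are illustrative assumptions, not details from the blog post):

```python
import random


def randomised_response(true_answer, p_truth=0.5):
    """Answer a yes/no question with plausible deniability.

    With probability p_truth the respondent answers truthfully;
    otherwise the recorded answer is a uniformly random Yes/No, so no
    single recorded answer reveals the true one.
    """
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5


def estimate_true_rate(noisy_answers, p_truth=0.5):
    """Recover an unbiased estimate of the true 'Yes' rate.

    observed = p_truth * true_rate + (1 - p_truth) * 0.5, so
    true_rate = (observed - (1 - p_truth) * 0.5) / p_truth.
    """
    observed = sum(noisy_answers) / len(noisy_answers)
    return (observed - (1 - p_truth) * 0.5) / p_truth


# Hypothetical per-trip question: "Did this trip pass through area X?"
# Assume roughly 10% of trips truly did.
true_flags = [random.random() < 0.10 for _ in range(100_000)]
noisy_flags = [randomised_response(flag) for flag in true_flags]
print(round(estimate_true_rate(noisy_flags), 3))  # close to 0.10
```

On this sketch, the authority only ever sees the noisy per-trip answers and the aggregate estimate; the provider never hands over raw trip data, which is the data-minimisation point made above.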

It’s easy to imagine transport authorities like TfL combining privacy-preserved mobility data from multiple mobility providers to compare insights and measure service provision. They could cross-reference the privacy-preserved bike trip data with demographic data in the local area to learn how different communities cycle. The first step to addressing inequality is being able to measure it….(More)”.