Regulating Electronic Means to Fight the Spread of COVID-19


In Custodia Legis, the blog of the Law Library of Congress: “It appears that COVID-19 will not go away any time soon. As there is currently no known cure or vaccine against it, countries have to find other ways to prevent and mitigate the spread of this infectious disease. Many countries have turned to electronic measures to provide general information and advice on COVID-19, allow people to check symptoms, trace contacts and alert people who have been in proximity to an infected person, identify “hot spots,” and track compliance with confinement measures and stay-at-home orders.

The Global Legal Research Directorate (GLRD) of the Law Library of Congress recently completed research on the kind of electronic measures countries around the globe are employing to fight the spread of COVID-19 and their potential privacy and data protection implications. We are excited to share with you the report that resulted from this research, Regulating Electronic Means to Fight the Spread of COVID-19. The report covers 23 selected jurisdictions, namely Argentina, Australia, Brazil, China, England, France, Iceland, India, Iran, Israel, Italy, Japan, Mexico, Norway, Portugal, the Russian Federation, South Africa, South Korea, Spain, Taiwan, Turkey, the United Arab Emirates, and the European Union (EU).

The surveys found that dedicated coronavirus apps that are downloaded to an individual’s mobile phone (particularly contact tracing apps), the use of anonymized mobility data, and creating electronic databases were the most common electronic measures. Whereas the EU recommends the use of voluntary apps because of the “high degree of intrusiveness” of mandatory apps, some countries take a different approach and require installing an app for people who enter the country from abroad, people who return to work, or people who are ordered to quarantine.
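
To make the mechanics concrete: proximity-based contact tracing apps of the kind surveyed typically broadcast short-lived rotating identifiers over Bluetooth rather than sharing locations. The sketch below is loosely modeled on the DP-3T / Apple-Google style of design and is illustrative only; it does not reproduce any jurisdiction's actual protocol, and all names and parameters are assumptions.

```python
import hashlib
import os
import time

# Minimal sketch of a rotating-identifier contact tracing scheme
# (DP-3T / Apple-Google style). Illustrative only; not any country's
# actual protocol. Parameters are assumptions.

ROTATION_SECONDS = 15 * 60  # broadcast identifiers rotate every 15 minutes

def daily_key() -> bytes:
    """A fresh random key generated on the phone each day."""
    return os.urandom(32)

def ephemeral_id(day_key: bytes, interval: int) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth.
    Observers cannot link identifiers across intervals without the day key."""
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

# Phone A broadcasts; phone B stores every identifier it hears.
key_a = daily_key()
interval = int(time.time()) // ROTATION_SECONDS
heard_by_b = {ephemeral_id(key_a, interval)}

# If A later tests positive, A uploads its day keys; B re-derives the
# identifiers locally and checks for overlap -- no central location log.
rederived = {ephemeral_id(key_a, interval)}
print("exposure detected:", bool(heard_by_b & rederived))
```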

However, these electronic measures also raise privacy and data protection concerns, in particular as they relate to sensitive health data. The surveys discuss the different approaches countries have taken to ensure compliance with privacy and data protection regulations, such as conducting rights impact assessments before the measures were deployed or having data protection agencies conduct an assessment after deployment.

The map below shows which jurisdictions have adopted COVID-19 contact tracing apps and the technologies they use.

Map shows COVID-19 contact tracing apps in selected jurisdictions. Created by Susan Taylor, Law Library of Congress, based on surveys in “Regulating Electronic Means to Fight the Spread of COVID-19” (Law Library of Congress, June 2020). This map does not cover other COVID-19 apps that use GPS/geolocation….(More)”.

Data is Dangerous: Comparing the Risks that the United States, Canada and Germany See in Data Troves


Paper by Susan Ariel Aaronson: “Data and national security have a complex relationship. Data is essential to national defense — to understanding and countering adversaries. Data underpins many modern military tools from drones to artificial intelligence. Moreover, to protect their citizens, governments collect lots of data about their constituents. Those same datasets are vulnerable to theft, hacking, and misuse. In 2013, the Department of Defense’s research arm (DARPA) funded a study examining whether “the availability of data provide[s] a determined adversary with the tools necessary to inflict nation-state level damage.” The results were not made public. Given the risks to the data of their citizens, defense officials should be vociferous advocates for interoperable data protection rules.

This policy brief uses case studies to show that inadequate governance of personal data can also undermine national security. The case studies represent diverse internet sectors affecting netizens differently. I do not address malware or disinformation, which are also issues of data governance, but have already been widely researched by other scholars. I illuminate how policymakers, technologists, and the public are/were unprepared for how inadequate governance spillovers affected national security. I then make some specific recommendations about what we can do about this problem….(More)”.

The Computermen


Podcast Episode by Jill Lepore: “In 1966, just as the foundations of the Internet were being imagined, the federal government considered building a National Data Center. It would be a centralized federal facility to hold computer records from each federal agency, in the same way that the Library of Congress holds books and the National Archives holds manuscripts. Proponents argued that it would help regulate and compile the vast quantities of data the government was collecting. Quickly, though, fears about privacy, government conspiracies, and government ineptitude buried the idea. But now, that National Data Center looks like a missed opportunity to create rules about data and privacy before the Internet took off. And in the absence of government action, corporations have made those rules themselves….(More)”.

Wrongfully Accused by an Algorithm


Kashmir Hill at the New York Times: “In what may be the first known case of its kind, a faulty facial recognition match led to a Michigan man’s arrest for a crime he did not commit….

The Shinola shoplifting occurred in October 2018. Katherine Johnston, an investigator at Mackinac Partners, a loss prevention firm, reviewed the store’s surveillance video and sent a copy to the Detroit police, according to their report.

Five months later, in March 2019, Jennifer Coulson, a digital image examiner for the Michigan State Police, uploaded a “probe image” — a still from the video, showing the man in the Cardinals cap — to the state’s facial recognition database. The system would have mapped the man’s face and searched for similar ones in a collection of 49 million photos.
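
For readers unfamiliar with the mechanics: a probe-image search like this typically converts each face into a numeric template and ranks the gallery by similarity, returning top candidates for human review. The sketch below is a schematic only; `embed` is a stand-in for a trained face-recognition model, and nothing here reflects the actual DataWorks or Michigan State Police implementation.

```python
import numpy as np

# Schematic of a "probe image" search: each face becomes a vector
# template, and the gallery is ranked by cosine similarity. embed() is
# a placeholder; production systems use trained recognition networks.

rng = np.random.default_rng(0)

def embed(image) -> np.ndarray:
    vec = rng.standard_normal(128)   # placeholder for a real model's output
    return vec / np.linalg.norm(vec)

gallery = {photo_id: embed(None) for photo_id in range(1_000)}  # stand-in for 49M photos
ids = np.array(list(gallery))
templates = np.stack(list(gallery.values()))

probe = embed(None)                  # the still from the surveillance video
scores = templates @ probe           # cosine similarity (vectors are unit-length)
top = ids[np.argsort(scores)[::-1][:5]]
print("candidate matches:", top)     # candidates go to a human examiner, not an arrest
```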

The state’s technology is supplied for $5.5 million by a company called DataWorks Plus. Founded in South Carolina in 2000, the company first offered mug shot management software, said Todd Pastorini, a general manager. In 2005, the firm began to expand the product, adding face recognition tools developed by outside vendors.

When one of these subcontractors develops an algorithm for recognizing faces, DataWorks attempts to judge its effectiveness by running searches using low-quality images of individuals it knows are present in a system. “We’ve tested a lot of garbage out there,” Mr. Pastorini said. These checks, he added, are not “scientific” — DataWorks does not formally measure the systems’ accuracy or bias.
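
By contrast, a “scientific” check of the kind Mr. Pastorini says DataWorks does not perform would look something like the sketch below: probes with known ground truth, a rank-1 hit rate, and results broken out by demographic group to surface bias. The `search` function here is an assumed interface, not a real API.

```python
from collections import defaultdict

# Sketch of a formal accuracy measurement, in contrast to the informal
# checks described above. search(image) is assumed to return gallery
# IDs ranked best-first; probes carry known ground truth.

def rank1_accuracy(probes, search):
    """probes: iterable of (image, true_gallery_id, group) tuples."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for image, true_id, group in probes:
        totals[group] += 1
        if search(image)[0] == true_id:   # is the top-ranked candidate correct?
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# A large gap in rank-1 accuracy between groups would indicate the kind
# of bias the federal study found.
```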

“We’ve become a pseudo-expert in the technology,” Mr. Pastorini said.

In Michigan, the DataWorks software used by the state police incorporates components developed by the Japanese tech giant NEC and by Rank One Computing, based in Colorado, according to Mr. Pastorini and a state police spokeswoman. In 2019, algorithms from both companies were included in a federal study of over 100 facial recognition systems that found they were biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces….(More)”.

Best Practices to Cover Ad Information Used for Research, Public Health, Law Enforcement & Other Uses


Press Release: “The Network Advertising Initiative (NAI) released privacy Best Practices for its members to follow if they use data collected for Tailored Advertising or Ad Delivery and Reporting for non-marketing purposes, such as sharing with research institutions, public health agencies, or law enforcement entities.

“Ad tech companies have data that can be a powerful resource for the public good if they follow this set of best practices for consumer privacy,” said Leigh Freund, NAI President and CEO. “During the COVID-19 pandemic, we’ve seen the opportunity for substantial public health benefits from sharing aggregate and de-identified location data.”

The NAI Code of Conduct – the industry’s premier self-regulatory framework for privacy, transparency, and consumer choice – covers data collected and used for Tailored Advertising or Ad Delivery and Reporting. The NAI Code has long addressed certain non-marketing uses of data collected for Tailored Advertising and Ad Delivery and Reporting by prohibiting any eligibility uses of such data, including uses for credit, insurance, healthcare, and employment decisions.

The NAI has always firmly believed that data collected for advertising purposes should not have a negative effect on consumers in their daily lives. However, over the past year, novel data uses have been introduced, especially during the recent health crisis. In the case of opted-in data such as Precise Location Information, a company may determine a user would benefit from more detailed disclosure in a just-in-time notice about non-marketing uses of the data being collected….(More)”.

IRS Used Cellphone Location Data to Try to Find Suspects


Byron Tau at the Wall Street Journal: “The Internal Revenue Service attempted to identify and track potential criminal suspects by purchasing access to a commercial database that records the locations of millions of American cellphones.

The IRS Criminal Investigation unit, or IRS CI, had a subscription to access the data in 2017 and 2018, and the way it used the data was revealed last week in a briefing by IRS CI officials to Sen. Ron Wyden’s (D., Ore.) office. The briefing was described to The Wall Street Journal by an aide to the senator.

IRS CI officials told Mr. Wyden’s office that their lawyers had given verbal approval for the use of the database, which is sold by a Virginia-based government contractor called Venntel Inc. Venntel obtains anonymized location data from the marketing industry and resells it to governments. IRS CI added that it let its Venntel subscription lapse after it failed to locate any targets of interest during the year it paid for the service, according to Mr. Wyden’s aide.

Justin Cole, a spokesman for IRS CI, said it entered into a “limited contract with Venntel to test their services against the law enforcement requirements of our agency.” IRS CI pursues the most serious and flagrant violations of tax law, and it said it used the Venntel database in “significant money-laundering, cyber, drug and organized-crime cases.”

The episode demonstrates a growing law enforcement interest in reams of anonymized cellphone movement data collected by the marketing industry. Government entities can try to use the data to identify individuals—which in many cases isn’t difficult with such databases.
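
Why identification “isn’t difficult”: a device’s most frequent overnight position usually reveals a home address, which links an “anonymous” advertising ID back to a person. A toy illustration with fabricated pings (all IDs, timestamps, and coordinates below are made up):

```python
from collections import Counter
from datetime import datetime

# Toy illustration of why "anonymized" location trails re-identify
# easily: the modal overnight position of a device usually reveals a
# home address. Pings are (advertising_id, timestamp, lat, lon).

pings = [
    ("ad-id-7f3a", datetime(2018, 5, 1, 2, 10), 38.8901, -77.0091),
    ("ad-id-7f3a", datetime(2018, 5, 2, 3, 40), 38.8901, -77.0091),
    ("ad-id-7f3a", datetime(2018, 5, 2, 13, 5), 38.8977, -77.0365),
]

def likely_home(device_pings, night=(22, 6)):
    """Most frequent coarse location seen during nighttime hours."""
    spots = Counter(
        (round(lat, 3), round(lon, 3))        # ~100 m grid cell
        for _, ts, lat, lon in device_pings
        if ts.hour >= night[0] or ts.hour < night[1]
    )
    return spots.most_common(1)[0][0] if spots else None

print(likely_home(pings))  # joining this with property records names a person
```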

It also shows that data from the marketing industry can be used as an alternative to obtaining data from cellphone carriers, a process that requires a court order. Until 2018, prosecutors needed “reasonable grounds” to seek cell tower records from a carrier. In June 2018, the U.S. Supreme Court strengthened the requirement, obliging the government to show probable cause that a crime has been committed before such data can be obtained from carriers….(More)”

Defining a ‘new normal’ for data privacy in the wake of COVID-19


Jack Dunn at IAPP: “…It is revealing that our relationship with privacy is amorphous and requires additional context in light of transformative technologies, new economic realities and public health emergencies. How can we reasonably evaluate the costs and benefits of Google or Facebook sharing location data with the federal government when it has been perfectly legal for Walgreens to share access to customer data with pharmaceutical advertisers? How does aggregating and anonymizing data safeguard privacy when a user’s personal data can be revealed through other data points?

The pandemic is only revealing that we’ve yet to reach a consensus on privacy norms that will come to define the digital age. 

This isn’t the first time that technology confounded notions of privacy and consumer protection. In fact, the constitutional right to privacy was born out of another public health crisis. Before 1965, 32 women per 100,000 live births died while giving birth. Similarly, 25 infants died per 100,000 live births. As a result, medical professionals and women’s rights advocates began arguing for greater access to birth control. When state legislatures sought to minimize access, birth control advocates filed lawsuits that eventually led to the Supreme Court’s seminal case regarding the right to privacy, Griswold v. Connecticut.

Today, there is growing public concern over the way in which consumer data is used to consolidate economic gain among the few while steering public perception among the many — particularly at a time when privacy seems to be the price for ending public health emergencies.

But the COVID-19 outbreak is also highlighting how user data has the capacity to improve consumer well-being and public health. While strict adherence to traditional notions of privacy may be ineffectual in a time of exponential technological growth, the history of our relationship to privacy and technology suggests regulatory policies can strike a balance between otherwise competing interests….(More)”.

Tech Firms Are Spying on You. In a Pandemic, Governments Say That’s OK.


Sam Schechner, Kirsten Grind and Patience Haggin at the Wall Street Journal: “While an undergraduate at the University of Virginia, Joshua Anton created an app to prevent users from drunk dialing, which he called Drunk Mode. He later began harvesting huge amounts of user data from smartphones to resell to advertisers.

Now Mr. Anton’s company, called X-Mode Social Inc., is one of a number of little-known location-tracking companies that are being deployed in the effort to reopen the country. State and local authorities wielding the power to decide when and how to reopen are leaning on these vendors for the data to underpin those critical judgment calls.

In California, Gov. Gavin Newsom’s office used data from Foursquare Labs Inc. to figure out if beaches were getting too crowded; when the state discovered they were, it tightened its rules. In Denver, the Tri-County Health Department is monitoring counties where the population on average tends to stray more than 330 feet from home, using data from Cuebiq Inc.
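
A “distance strayed from home” metric like the one Cuebiq reportedly provides can be computed from raw pings with nothing more than great-circle distances. The sketch below is illustrative; the article does not describe Cuebiq’s actual methodology, and every function and field name here is an assumption. Only the 330-foot threshold comes from the reporting.

```python
import math

# Illustrative "distance strayed from home" mobility metric. The
# 330-foot threshold mirrors the article; everything else is assumed.

FEET_PER_METER = 3.28084

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def max_stray_feet(home, day_pings):
    """Farthest a device ranged from its inferred home in one day."""
    return max(haversine_m(*home, lat, lon) for lat, lon in day_pings) * FEET_PER_METER

home = (39.7392, -104.9903)                     # fabricated Denver-area coordinates
day_pings = [(39.7392, -104.9903), (39.7401, -104.9910)]
print(max_stray_feet(home, day_pings) > 330)    # device counted as "leaving home"
```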

Researchers at the University of Texas at San Antonio are using movement data from a variety of companies, including the geolocation firm SafeGraph, to guide city officials there on the best strategies for getting residents back to work.

Many of the location-tracking firms, data brokers and other middlemen are part of the ad-tech industry, which has come under increasing fire in recent years for building what critics call a surveillance economy. Data for targeting ads at individuals, including location information, can also end up in the hands of law-enforcement agencies or political groups, often with limited disclosure to users. Privacy laws are cropping up in states including California, along with calls for federal privacy legislation like that in the European Union.

But some public-health authorities are setting aside those concerns to fight an unprecedented pandemic. Officials are desperate for all types of data to identify people potentially infected with the virus and to understand how they are behaving to predict potential hot spots—whether those people realize it or not…(More)”

IoT Security Is a Mess. Privacy ‘Nutrition’ Labels Could Help


Lily Hay Newman at Wired: “…Given that IoT security seems unlikely to magically improve anytime soon, researchers and regulators are rallying behind a new approach to managing IoT risk. Think of it as nutrition labels for embedded devices.

At the IEEE Symposium on Security & Privacy last month, researchers from Carnegie Mellon University presented a prototype security and privacy label they created based on interviews and surveys of people who own IoT devices, as well as privacy and security experts. They also published a tool for generating their labels. The idea is to shed light on a device’s security posture but also explain how it manages user data and what privacy controls it has. For example, the labels highlight whether a device can get security updates and how long a company has pledged to support it, as well as the types of sensors present, the data they collect, and whether the company shares that data with third parties.
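
Such a label implies a machine-readable schema behind the printed card. The sketch below approximates one, with fields drawn from the factors the researchers highlight (security updates, support window, sensors, data sharing, and a secondary-layer URL); the CMU team’s actual published format differs in detail, and the device shown is hypothetical.

```python
from dataclasses import dataclass, field, asdict
import json

# Approximation of an IoT privacy/security "nutrition label" as a
# machine-readable record. Fields track the factors named above; the
# actual CMU label schema differs in detail.

@dataclass
class PrivacyLabel:
    device: str
    security_updates: bool          # can the device receive patches?
    support_until: str              # pledged end-of-support date
    sensors: list = field(default_factory=list)
    data_collected: list = field(default_factory=list)
    shared_with_third_parties: bool = False
    secondary_label_url: str = ""   # the URL/QR second layer with full detail

label = PrivacyLabel(
    device="ExampleCam 2000",       # hypothetical device
    security_updates=True,
    support_until="2023-06",
    sensors=["camera", "microphone"],
    data_collected=["video", "audio"],
    shared_with_third_parties=True,
    secondary_label_url="https://example.com/label/examplecam-2000",
)
print(json.dumps(asdict(label), indent=2))  # printable primary-layer summary
```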

“In an IoT setting, the amount of sensors and information you have about users is potentially invasive and ubiquitous,” says Yuvraj Agarwal, a networking and embedded systems researcher who worked on the project. “It’s like trying to fix a leaky bucket. So transparency is the most important part. This work shows and enumerates all the choices and factors for consumers.”

Nutrition labels on packaged foods have a certain amount of standardization around the world, but they’re still more opaque than they could be. And security and privacy issues are even less intuitive to most people than soluble and insoluble fiber. So the CMU researchers focused a lot of their efforts on making their IoT label as transparent and accessible as possible. To that end, they included both a primary and secondary layer to the label. The primary label is what would be printed on device boxes. To access the secondary label, you could follow a URL or scan a QR code to see more granular information about a device….(More)”.

Sharing Health Data and Biospecimens with Industry — A Principle-Driven, Practical Approach


Kayte Spector-Bagdady et al at the New England Journal of Medicine: “The advent of standardized electronic health records, sustainable biobanks, consumer-wellness applications, and advanced diagnostics has resulted in new health information repositories. As highlighted by the Covid-19 pandemic, these repositories create an opportunity for advancing health research by means of secondary use of data and biospecimens. Current regulations in this space give substantial discretion to individual organizations when it comes to sharing deidentified data and specimens. But some recent examples of health care institutions sharing individual-level data and specimens with companies have generated controversy. Academic medical centers are therefore both practically and ethically compelled to establish best practices for governing the sharing of such contributions with outside entities.1 We believe that the approach we have taken at Michigan Medicine could help inform the national conversation on this issue.

The Federal Policy for the Protection of Human Subjects offers some safeguards for research participants from whom data and specimens have been collected. For example, researchers must notify participants if commercial use of their specimens is a possibility. These regulations generally cover only federally funded work, however, and they don’t apply to deidentified data or specimens. Because participants value transparency regarding industry access to their data and biospecimens, our institution set out to create standards that would better reflect participants’ expectations and honor their trust. Using a principlist approach that balances beneficence and nonmaleficence, respect for persons, and justice, buttressed by recent analyses and findings regarding contributors’ preferences, Michigan Medicine established a formal process to guide our approach….(More)”.