The Human Experience Will Not Be Quantified


Phil Klay at the New York Times: “…Stories are a quintessentially human method of responding to the chaos and uncertainty of the world. Science is a quintessentially human method of trying to control that chaos, and data is its raw material. Adrift in the world, uncertain of the future, hostage to fate, but possessed of increasingly powerful tools for carving up pieces of the world and putting them under the microscope, is it any wonder that we increasingly turn to science when looking for deliverance from our human predicaments?

Science, after all, will eventually bring us to the end of the pandemic, just as it has helped limit the damage through better treatments and proof of the benefits of wearing masks. “Science over fiction,” was one slogan of the Joe Biden campaign, a welcome message to those who’d like public policy tethered more to reality than political fantasy.

But because science supposedly gives clear answers about everything from how to open schools in a pandemic to who will be elected president, we tend to rush to embrace it as a panacea. Some, like the popular podcaster and author Sam Harris, even think science can answer moral questions. Rarely does it occur to us how often the invocation of “science” is used to mask value judgments, or political deliberation.

When the Centers for Disease Control and Prevention and the American Academy of Pediatrics released separate guidelines for reopening schools, the difference lay not in the underlying science but in their institutional priorities, one focused on disease spread and the other on the welfare of children. Likewise, the difference in how New York City handled the reopenings of day cares and schools reflected not simply science, but also what could be more easily demanded of workers who lacked the protection of a powerful union.

As much as we’d like to believe in “science over fiction,” decisions in the real world require negotiating between what we think the data means, what human value we’d like to assign to it and what stories about it we can get others to accept. Data alone is not knowledge, and it is certainly not wisdom. It rarely says as much as we think it does.

Yet its allure is undeniable, persistent. As I watched the election returns on Tuesday and Wednesday, I did so with the sinking feeling that I’d been fooled again by the lure of data. Even though it looked like Biden could still win, it was clear that those hard numbers I’d been absorbing for weeks, based on fine-tuned methodologies, correcting for past mistakes, aggregated to minimize chances of error, hadn’t come close to reflecting reality. “You are literally working on an essay about the problems with relying too much on data,” my wife told me the morning after the election, “and yet you were so confident in the polls.”…(More)”.

Your phone already tracks your location. Now that data could fight voter suppression


Article by Seth Rosenblatt: “Smartphone location data is a dream for marketers who want to know where you go and how long you spend there—and a privacy nightmare. But this kind of geolocation data could also be used to protect people’s voting rights on Election Day.

The newly founded nonprofit Center for New Data is now tracking voters at the polls using smartphone location data to help researchers understand how easy—or difficult—it is for people to vote in different places. Called the Observing Democracy project, the nonpartisan effort is making data on how far people have to travel to vote and how long they have to wait in line available in a privacy-friendly way so it can be used to craft election policies that ensure voting is accessible for everyone.

Election data has already fueled changes in various municipalities and states. A 66-page lawsuit filed by Fair Fight Action against the state of Georgia in the wake of Stacey Abrams’s narrow loss to Brian Kemp in the 2018 gubernatorial race relies heavily on data to back its assertions of unconstitutionally delayed and deferred voter registration, unfair challenges to absentee and provisional ballots, and unjustified purges of voter rolls—all hallmarks of voter suppression.

The promise of Observing Democracy is to make this type of impactful data available much more rapidly than ever before. Barely a month old, Observing Democracy isn’t wasting any time: Its all-volunteer staffers will be receiving data potentially as soon as Nov. 4 on voter wait times at polling locations, travel times to polling stations, and how frequently ballot drop-off boxes are visited, courtesy of location-data mining companies X-Mode Social and Veraset, which was spun off from SafeGraph….(More)”.
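
The article does not spell out how wait times are derived from location pings, but the core computation (measuring how long an anonymized device dwells inside a geofence around a polling place) can be sketched roughly as below. This is a minimal illustration with hypothetical field names and a made-up geofence radius, not Observing Democracy's actual pipeline.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

# Hypothetical ping format: (anonymized device id, UTC timestamp, latitude, longitude)
pings = [
    ("a1", "2020-11-03T09:02:00", 33.7490, -84.3880),
    ("a1", "2020-11-03T09:48:00", 33.7491, -84.3879),
    ("a1", "2020-11-03T10:15:00", 33.7600, -84.4000),
]

POLL_LAT, POLL_LON = 33.7490, -84.3880   # example polling place coordinates
RADIUS_M = 100                            # assumed geofence radius around the site, in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def dwell_minutes(device_pings):
    """Minutes between a device's first and last ping inside the geofence."""
    inside = [datetime.fromisoformat(ts) for _, ts, lat, lon in device_pings
              if haversine_m(lat, lon, POLL_LAT, POLL_LON) <= RADIUS_M]
    if len(inside) < 2:
        return 0.0
    return (max(inside) - min(inside)).total_seconds() / 60

print(dwell_minutes(pings))  # 46.0 minutes for device 'a1' in this toy example
```

In practice, estimates like these would be aggregated across many devices per precinct and only the aggregate statistics released, which is presumably what makes the approach privacy-friendly.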

To mitigate the costs of future pandemics, establish a common data space


Article by Stephanie Chin and Caitlin Chin: “To improve data sharing during global public health crises, it is time to explore the establishment of a common data space for highly infectious diseases. Common data spaces integrate multiple data sources, enabling a more comprehensive analysis of data based on greater volume, range, and access. At its essence, a common data space is like a public library system, which has collections of different types of resources from books to video games; processes to integrate new resources and to borrow resources from other libraries; a catalog system to organize, sort, and search through resources; a library card system to manage users and authorization; and even curated collections or displays that highlight themes among resources.

Even before the COVID-19 pandemic, there was significant momentum to make critical data more widely accessible. In the United States, Title II of the Foundations for Evidence-Based Policymaking Act of 2018, or the OPEN Government Data Act, requires federal agencies to publish their information online as open data, using standardized, machine-readable data formats. This information is now available on the federal data.gov catalog and includes 50 state- or regional-level data hubs and 47 city- or county-level data hubs. In Europe, the European Commission released a data strategy in February 2020 that calls for common data spaces in nine sectors, including healthcare, shared by EU businesses and governments.
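
As a concrete illustration of what machine-readable open data means in practice: catalog.data.gov runs on CKAN, so its holdings can be searched programmatically. The sketch below uses CKAN's standard package_search action; the query term is arbitrary, and the response handling assumes the usual CKAN JSON layout.

```python
import json
import urllib.parse
import urllib.request

# Search the catalog.data.gov CKAN API for datasets matching a keyword.
query = urllib.parse.urlencode({"q": "covid-19 cases", "rows": 5})
url = f"https://catalog.data.gov/api/3/action/package_search?{query}"

with urllib.request.urlopen(url, timeout=30) as resp:
    payload = json.load(resp)

for dataset in payload["result"]["results"]:
    # Each CKAN package lists its distributions (CSV, JSON, API endpoints) as resources.
    formats = sorted({r.get("format", "") for r in dataset.get("resources", [])})
    print(dataset["title"], "-", ", ".join(f for f in formats if f))
```

Run against the live catalog, this should list a handful of matching datasets along with the formats in which they are distributed.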

Going further, a common data space could help identify outbreaks and accelerate the development of new treatments by compiling line list incidence data, epidemiological information and models, genome and protein sequencing, testing protocols, results of clinical trials, passive environmental monitoring data, and more.

Moreover, it could foster a common understanding and consensus around the facts—a prerequisite to reach international buy-in on policies to address situations unique to COVID-19 or future pandemics, such as the distribution of medical equipment and PPE, disruption to the tourism industry and global supply chains, social distancing or quarantine, and mass closures of businesses….(More)”. See also Call for Action for a Data Infrastructure to tackle Pandemics and other Dynamic Threats.

How to Use the Bureaucracy to Govern Well


Good Governance Paper by Rebecca Ingber: “…Below I offer four concrete recommendations for deploying Intentional Bureaucratic Architecture within the executive branch. But first, I will establish three key background considerations that provide context for these recommendations.  The focus of this piece is primarily executive branch legal decisionmaking, but many of these recommendations apply equally to other areas of policymaking.

First, make room for the views and expertise of career officials. As a political appointee entering a new office, ask those career officials: What are the big issues on the horizon on which we will need to take policy or legal views?  What are the problems with the positions I am inheriting?  What is and is not working?  Where are the points of conflict with our allies abroad or with Congress?  Career officials are the institutional memory of the government and often the only real experts in the specific work of their agency.  They will know about the skeletons in the closet and where the bodies are buried and all the other metaphors for knowing things that other people do not. Turn to them early. Value them. They will have views informed by experience rather than partisan politics. But all bureaucratic actors, including civil servants, also bring to the table their own biases, and they may overvalue the priorities of their own office over others. Valuing their role does not mean handing the reins over to the civil service—good governance requires exercising judgment and balancing the benefits of experience and expertise with fresh eyes and leadership. A savvy bureaucratic actor might know how to “get around” the bureaucratic roadblocks, but the wise bureaucratic player also knows how much the career bureaucracy has to offer and exercises judgment based in clear values about when to defer and when to overrule.

Second, get ahead of decisions: choose vehicles for action carefully and early. The reality of government life is that much of the big decisionmaking happens in the face of a fire drill. As I’ve written elsewhere, the trigger or “interpretation catalyst” that compels the government to consider and assert a position—in other words, the cause of that fire drill—shapes the whole process of decisionmaking and the resulting decision. When an issue arises in defensive litigation, a litigation-driven process controls.  That means that career line attorneys shape the government’s legal posture, drawing from longstanding positions and often using language from old briefs. DOJ calls the shots in a context biased toward zealous defense of past action. That looks very different from a decisionmaking process that results from the president issuing an executive order or presidential memorandum, a White House official deciding to make a speech, the State Department filing a report with a treaty body, or DOD considering whether to engage in an operation involving force. Each of these interpretation catalysts triggers a different process for decisionmaking that will shape the resulting outcome.  But because of the stickiness of government decisions—and the urgent need to move on to the next fire drill—these positions become entrenched once taken. That means that the process and outcome are driven by the hazards of external events, unless officials find ways to take the reins and get ahead of them.

And finally, an incoming administration must put real effort into Intentional Bureaucratic Architecture by deliberately and deliberatively creating and managing the bureaucratic processes in which decisionmaking happens. Novel issues arise and fire drills will inevitably happen in even the best prepared administrations.  The bureaucratic architecture will dictate how decisionmaking happens from the novel crises to the bread and butter of daily agency work. There are countless varieties of decisionmaking models inside the executive branch, which I have classified in other work. These include a unitary decider model, of which DOJ’s Office of Legal Counsel (OLC) is a prime example, an agency decider model, and a group lawyering model. All of these models will continue to co-exist. Most modern national security decisionmaking engages the interests and operations of multiple agencies. Therefore, in a functional government, most of these decisions will involve group lawyering in some format—from agency lawyers picking up the phone to coordinate with counterparts in other agencies to ad hoc meetings to formal regularized working groups with clear hierarchies all the way up to the cabinet. Often these processes evolve organically, as issues arise. Some are created from the top down by presidential administrations that want to impose order on the process. But all of these group lawyering dynamics often lack a well-defined process for determining the outcome in cases of conflict or deciding how to establish a clear output. This requires rule setting and organizing the process from the top down….(More)”.

Tracking COVID-19: U.S. Public Health Surveillance and Data


CRS Report: “Public health surveillance, or ongoing data collection, is an essential part of public health practice. Particularly during a pandemic, timely data are important to understanding the epidemiology of a disease in order to craft policy and guide response decision making. Many aspects of public health surveillance—such as which data are collected and how—are often governed by law and policy at the state and sub-federal level, though informed by programs and expertise at the Centers for Disease Control and Prevention (CDC). The Coronavirus Disease 2019 (COVID-19) pandemic has exposed limitations and challenges with U.S. public health surveillance, including those related to the timeliness, completeness, and accuracy of data.

This report provides an overview of U.S. public health surveillance, current COVID-19 surveillance and data collection, and selected policy issues that have been highlighted by the pandemic. Appendix B includes a compilation of selected COVID-19 data resources….(More)”.

Harnessing the wisdom of crowds can improve guideline compliance of antibiotic prescribers and support antimicrobial stewardship


Paper by Eva M. Krockow et al: “Antibiotic overprescribing is a global challenge contributing to rising levels of antibiotic resistance and mortality. We test a novel approach to antibiotic stewardship. Capitalising on the concept of “wisdom of crowds”, which states that a group’s collective judgement often outperforms the average individual, we test whether pooling treatment durations recommended by different prescribers can improve antibiotic prescribing. Using international survey data from 787 expert antibiotic prescribers, we run computer simulations to test the performance of the wisdom of crowds by comparing three data aggregation rules across different clinical cases and group sizes. We also identify patterns of prescribing bias in recommendations about antibiotic treatment durations to quantify current levels of overprescribing. Our results suggest that pooling the treatment recommendations (using the median) could improve guideline compliance in groups of three or more prescribers. Implications for antibiotic stewardship and the general improvement of medical decision making are discussed. Clinical applicability is likely to be greatest in the context of hospital ward rounds and larger, multidisciplinary team meetings, where complex patient cases are discussed and existing guidelines provide limited guidance….(More)”.
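
A stripped-down version of the paper's idea, pooling several prescribers' recommended treatment durations and checking whether the group median lands closer to a guideline duration than the average individual does, might look like the simulation below. The guideline value, bias, and spread are invented for illustration and are not the study's data.

```python
import random
import statistics

random.seed(0)

GUIDELINE_DAYS = 5   # hypothetical guideline-recommended treatment duration
N_TRIALS = 1000

def simulate_recommendation():
    """One prescriber's recommended duration, with an assumed upward (overprescribing) bias."""
    return max(1, round(random.gauss(mu=7, sigma=3)))

def trial(group_size):
    recs = [simulate_recommendation() for _ in range(group_size)]
    individual_error = statistics.mean(abs(r - GUIDELINE_DAYS) for r in recs)
    median_error = abs(statistics.median(recs) - GUIDELINE_DAYS)
    return individual_error, median_error

for group_size in (1, 3, 5, 9):
    results = [trial(group_size) for _ in range(N_TRIALS)]
    avg_individual = statistics.mean(ind for ind, _ in results)
    avg_median = statistics.mean(med for _, med in results)
    print(f"group of {group_size}: avg individual error {avg_individual:.2f} days, "
          f"median-of-group error {avg_median:.2f} days")
```

In this toy setup the median mostly cancels individual variability rather than the shared upward bias, so it narrows, but does not close, the gap to the guideline.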

Challenging the Use of Algorithm-driven Decision-making in Benefits Determinations Affecting People with Disabilities


Paper by Lydia X. Z. Brown, Michelle Richardson, Ridhi Shetty, and Andrew Crawford: “Governments are increasingly turning to algorithms to determine whether and to what extent people should receive crucial benefits for programs like Medicaid, Medicare, unemployment, and Social Security Disability. Billed as a way to increase efficiency and root out fraud, these algorithm-driven decision-making tools are often implemented without much public debate and are incredibly difficult to understand once underway. Reports from people on the ground confirm that the tools are frequently reducing and denying benefits, often with unfair and inhumane results.

Benefits recipients are challenging these tools in court, arguing that flaws in the programs’ design or execution violate their due process rights, among other claims. These cases are some of the few active courtroom challenges to algorithm-driven decision-making, producing important precedent about people’s right to notice, explanation, and other procedural due process safeguards when algorithm-driven decisions are made about them. As the legal and policy world continues to recognize the outsized impact of algorithm-driven decision-making in various aspects of our lives, public benefits cases provide important insights into how such tools can operate; the risks of errors in design and execution; and the devastating human toll when tools are adopted without effective notice, input, oversight, and accountability. 

This report analyzes lawsuits that have been filed within the past 10 years arising from the use of algorithm-driven systems to assess people’s eligibility for, or the distribution of, public benefits. It identifies key insights from the various cases into what went wrong and analyzes the legal arguments that plaintiffs have used to challenge those systems in court. It draws on direct interviews with attorneys who have litigated these cases and plaintiffs who sought to vindicate their rights in court – in some instances suing not only for themselves, but on behalf of similarly situated people. The attorneys work in legal aid offices, civil rights litigation shops, law school clinics, and disability protection and advocacy offices. The cases cover a range of benefits issues and have netted mixed results.

People with disabilities experience disproportionate and particular harm because of unjust algorithm-driven decision-making, and we have attempted to center disabled people’s stories and cases in this paper. As disabled people fight for rights inside and outside the courtroom on a wide range of issues, we focus on litigation and highlight the major legal theories for challenging improper algorithm-driven benefit denials in the U.S. 

The good news is that in some cases, plaintiffs are successfully challenging improper adverse benefits decisions with Constitutional, statutory, and administrative claims. But like other forms of civil rights and impact litigation, the bad news is that relief can be temporary and is almost always delayed. Litigation must therefore work in tandem with the development of new processes driven by people who require access to public assistance and whose needs are centered in these processes. We hope this contribution informs not only the development of effective litigation, but a broader public conversation about the thoughtful design, use, and oversight of algorithm-driven decision-making systems….(More)”.

Open data in public libraries: Gauging activities and supporting ambitions


Paper by Kaitlin Fender Throgmorton, Bree Norlander and Carole L. Palmer: “As the open data movement grows, public libraries must assess if and how to invest resources in this new service area. This paper reports on a recent survey on open data in public libraries across Washington state, conducted by the Open Data Literacy project (ODL) in collaboration with the Washington State Library. Results document interests and activity in open data across small, medium, and large libraries in relation to traditional library services and priorities. Libraries are particularly active in open data through reference services and are beginning to release their own library data to the public. While capacity and resource challenges hinder progress for some, many libraries, large and small, are making progress on new initiatives, including strategic collaborations with local government agencies. Overall, the level and range of activity suggest that Washington state public libraries of all sizes recognize the value of open data for their communities, with a groundswell of libraries moving beyond ambition to action as they develop new services through evolution and innovation….(More)”.

A qualitative study of big data and the opioid epidemic: recommendations for data governance


Paper by Elizabeth A. Evans, Elizabeth Delorme, Karl Cyr & Daniel M. Goldstein: “The opioid epidemic has enabled rapid and unsurpassed use of big data on people with opioid use disorder to design initiatives to battle the public health crisis, generally without adequate input from impacted communities. Efforts informed by big data are saving lives, yielding significant benefits. Uses of big data may also undermine public trust in government and cause other unintended harms….

We conducted focus groups and interviews in 2019 with 39 big data stakeholders (gatekeepers, researchers, patient advocates) who had interest in or knowledge of the Public Health Data Warehouse maintained by the Massachusetts Department of Public Health.

Concerns regarding big data on opioid use are rooted in potential privacy infringements due to linkage of previously distinct data systems, increased profiling and surveillance capabilities, limitless lifespan, and lack of explicit informed consent. Also problematic is the inability of affected groups to control how big data are used, the potential of big data to increase stigmatization and discrimination of those affected despite data anonymization, and uses that ignore or perpetuate biases. Participants support big data processes that protect and respect patients and society, ensure justice, and foster patient and public trust in public institutions. Recommendations for ethical big data governance offer ways to narrow the big data divide (e.g., prioritize health equity, set off-limits topics/methods, recognize blind spots), enact shared data governance (e.g., establish community advisory boards), cultivate public trust and earn social license for big data uses (e.g., institute safeguards and other stewardship responsibilities, engage the public, communicate the greater good), and refocus ethical approaches.

Using big data to address the opioid epidemic poses ethical concerns which, if unaddressed, may undermine its benefits. Findings can inform guidelines on how to conduct ethical big data governance and in ways that protect and respect patients and society, ensure justice, and foster patient and public trust in public institutions….(More)”

Consumer Reports Study Finds Marketplace Demand for Privacy and Security


Press Release: “American consumers are increasingly concerned about privacy and data security when purchasing new products and services, which may be a competitive advantage to companies that take action towards these consumer values, a new Consumer Reports study finds. 

The new study, “Privacy Front and Center” from CR’s Digital Lab with support from Omidyar Network, looks at the commercial benefits for companies that differentiate their products based on privacy and data security. The study draws from a nationally representative CR survey of 5,085 adult U.S. residents conducted in February 2020, a meta-analysis of 25 years of public opinion studies, and a conjoint analysis that seeks to quantify how consumers weigh privacy and security in their hardware and software purchasing decisions. 
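
The press release does not detail the conjoint methodology, but the basic estimation step (regressing respondents' ratings of product profiles on dummy-coded attributes to recover a part-worth for each attribute, including privacy and security) can be sketched as follows. The attributes, levels, and ratings below are fabricated; CR's actual design and estimator may differ.

```python
import numpy as np

# Each hypothetical product profile: (strong_privacy, low_price, known_brand) as 0/1 dummies,
# plus a respondent's preference rating for that profile (fabricated example data).
profiles = np.array([
    # privacy, price, brand
    [1, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 1, 0],
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
])
ratings = np.array([9.0, 7.0, 6.0, 4.0, 8.5, 3.0, 6.5, 7.5])

# Add an intercept column and fit ordinary least squares: rating ~ b0 + b . attributes.
X = np.column_stack([np.ones(len(profiles)), profiles])
coefs, *_ = np.linalg.lstsq(X, ratings, rcond=None)

intercept, part_worths = coefs[0], coefs[1:]
for name, w in zip(["strong privacy", "lower price", "known brand"], part_worths):
    print(f"{name}: part-worth {w:+.2f}")

# Relative importance: each attribute's share of the total estimated part-worth magnitude.
importance = np.abs(part_worths) / np.abs(part_worths).sum()
print("relative importance:", np.round(importance, 2))
```

Dividing each absolute part-worth by their sum gives a rough relative-importance share, which is one common way conjoint results are summarized and compared across attributes such as privacy and price.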

“This study shows that raising the standard for privacy and security is a win-win for consumers and the companies,” said Ben Moskowitz, the director of the Digital Lab at Consumer Reports. “Given the rapid proliferation of internet connected devices, the rise in data breaches and cyber attacks, and the demand from consumers for heightened privacy and security measures, there’s an undeniable business case for companies to invest in creating more private and secure products.” 

Here are some of the key findings from the study:

  • According to CR’s February 2020 nationally representative survey, 74% of consumers are at least moderately concerned about the privacy of their personal data.
  • Nearly all Americans (96%) agree that more should be done to ensure that companies protect the privacy of consumers.
  • A majority of smart product owners (62%) worry about potential loss of privacy when buying them for their home or family.
  • The privacy/security-conscious consumer class seems to include more men and people of color.
  • Experiencing a data breach correlates with a higher willingness to pay for privacy, and 30% of Americans have experienced one.
  • Of the Android users who switched to iPhones, 32% indicated doing so because of Apple’s perceived privacy or security benefits relative to Android….(More)”.