Characterizing Disinformation Risk to Open Data in the Post-Truth Era


Paper by Adrienne Colborne and Michael Smit: “Curated, labeled, high-quality data is a valuable commodity for tasks such as business analytics and machine learning. Open data is a common source of such data—for example, retail analytics draws on open demographic data, and weather forecast systems draw on open atmospheric and ocean data. Open data is released openly by governments to achieve various objectives, such as transparency, informing citizen engagement, or supporting private enterprise.

Critical examination of ongoing social changes, including the post-truth phenomenon, suggests the quality, integrity, and authenticity of open data may be at risk. We introduce this risk through various lenses, describe some of the types of risk we expect using a threat model approach, identify approaches to mitigate each risk, and present real-world examples of cases where the risk has already caused harm. As an initial assessment of awareness of this disinformation risk, we compare our analysis to perspectives captured during open data stakeholder consultations in Canada…(More)”.

Race After Technology: Abolitionist Tools for the New Jim Code


Book by Ruha Benjamin: “From everyday apps to complex algorithms, Ruha Benjamin cuts through tech-industry hype to understand how emerging technologies can reinforce White supremacy and deepen social inequity.

Benjamin argues that automation, far from being a sinister story of racist programmers scheming on the dark web, has the potential to hide, speed up, and deepen discrimination while appearing neutral and even benevolent when compared to the racism of a previous era. Presenting the concept of the “New Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite. Moreover, she makes a compelling case for race itself as a kind of technology, designed to stratify and sanctify social injustice in the architecture of everyday life.

This illuminating guide provides conceptual tools for decoding tech promises with sociologically informed skepticism. In doing so, it challenges us to question not only the technologies we are sold but also the ones we ourselves manufacture….(More)”.

The technology of witnessing brutality


Axios: “The ways Americans capture and share records of racist violence and police misconduct keep changing, but the pain of the underlying injustices they chronicle remains a stubborn constant.

Driving the news: After George Floyd’s death at the hands of Minneapolis police sparked wide protests, Minnesota Gov. Tim Walz said, “Thank God a young person had a camera to video it.”

Why it matters: From news photography to TV broadcasts to camcorders to smartphones, improvements in the technology of witness over the past century mean we’re more instantly and viscerally aware of each new injustice.

  • But unless our growing power to collect and distribute evidence of injustice can drive actual social change, the awareness these technologies provide just ends up fueling frustration and despair.

For decades, still news photography was the primary channel through which the public became aware of incidents of racial injustice.

  • A horrific 1930 photo of the lynching of J. Thomas Shipp and Abraham S. Smith, two black men in Marion, Indiana, brought the incident to national attention and inspired the song “Strange Fruit.” But the killers were never brought to justice.
  • Photos of the mutilated body of Emmett Till catalyzed a nationwide reaction to his 1955 lynching in Mississippi.

In the 1960s, television news footage brought scenes of police turning dogs and water cannons on peaceful civil rights protesters in Birmingham and Selma, Alabama, into viewers’ living rooms.

  • The TV coverage was moving in both senses of the word.

In 1991, a camcorder tape shot by a Los Angeles plumber named George Holliday captured images of cops brutally beating Rodney King.

  • In the pre-internet era, it was only after the King tape was broadcast on TV that Americans could see it for themselves.

Over the past decade, smartphones have enabled witnesses and protesters to capture and distribute photos and videos of injustice quickly — sometimes, as it’s happening.

  • This power helped catalyze the Black Lives Matter movement beginning in 2013 and has played a growing role in broader public awareness of police brutality.

Between the lines: For a brief moment mid-decade, some hoped that the combination of a public well-supplied with video recording devices and requirements that police wear bodycams would introduce a new level of accountability to law enforcement.

The bottom line: Smartphones and social media deliver direct accounts of grief- and rage-inducing stories…(More)”.

Centering Racial Equity Throughout Data Integration


Toolkit by AISP: “Societal “progress” is often marked by the construction of new infrastructure that fuels change and innovation. Just as railroads and interstate highways were the defining infrastructure projects of the 1800s and 1900s, the development of data infrastructure is a critical innovation of our century. Railroads and highways were drivers of development and prosperity for some investors and sites. Yet other individuals and communities were harmed, displaced, bypassed, ignored, and forgotten by those efforts.

At this moment in our history, we can co-create data infrastructure to promote racial equity and the public good, or we can invest in data infrastructure that disregards the historical, social, and political context—reinforcing racial inequity that continues to harm communities. Building data infrastructure without a racial equity lens and understanding of historical context will exacerbate existing inequalities along the lines of race, gender, class, and ability. Instead, we commit to contextualize our work in the historical and structural oppression that shapes it, and organize stakeholders across geography, sector, and experience to center racial equity throughout data integration….(More)”.

How Congress can improve productivity by looking to the rest of the world


Beth Noveck and Dane Gambrell at the Hill: “…While an important first step in helping to resume operations, Congress needs to follow the lead of those many legislatures around the world who have changed their laws and rules and are using technology to continue to legislate, conduct oversight and even innovate. 

Though efforts to restart by adopting proxy voting are a step in the right direction, they do not go far enough to create what Georgetown University’s Lorelei Kelly calls the “modern and safe digital infrastructure for the world’s most powerful national legislature.” 

Congress has all but shut down since March. While the Senate formally “re-opened” on May 4, the chamber is operating under restrictive new guidelines, with hearings largely closed to the public and lawmakers advised to bring only a skeleton crew to run their offices. Considering that the average age of a senator is 63 and the average age of a Member of the House is 58, this caution comes as no surprise.

Yet when we take into account that parliaments around the world from New Zealand to the Maldives are holding committee meetings, running plenary sessions, voting and even engaging the public in the lawmaking process online, we should be asking Congress to do more faster. 

Instead, bitter partisan wrangling — with Republicans accusing Democrats of taking advantage of social distancing to launch a power grab and Democrats accusing Republicans of failing to exercise oversight — is delaying the adoption of long available and easy to use technologies. More than a left-right issue, moving online is a top-down issue with leadership of both parties using the crisis to consolidate power.

Working online

The Parliament of the United Kingdom, for example, is one of dozens of legislatures turning to online video conferencing tools such as Zoom, Microsoft Teams, Cisco Web Meetings and Google Hangouts to do plenary or committee meetings. After 800 years, lawmakers in the House of Commons convened the first-ever “virtual Parliament” at the end of April. In this hybrid approach, some MPs were present in the legislative chamber while most joined remotely using Zoom…(More)”.

Considering the Source: Varieties of COVID-19 Information


Congressional Research Service: “In common parlance, the terms propaganda, misinformation, and disinformation are often used interchangeably, often with connotations of deliberate untruths of nefarious origin. In a national security context, however, these terms refer to categories of information that are created and disseminated with different intent and serve different strategic purposes. This primer examines these categories to create a framework for understanding the national security implications of information related to the Coronavirus Disease 2019 (COVID-19) pandemic….(More)”.

A call for a new generation of COVID-19 models


Blog post by Alex Engler: “Existing models have been valuable, but they were not designed to support these types of critical decisions. A new generation of models that estimate the risk of COVID-19 spread for precise geographies—at the county or even more localized level—would be much more informative for these questions. Rather than produce long-term predictions of deaths or hospital utilization, these models could estimate near-term relative risk to inform local policymaking. Going forward, governors and mayors need local, current, and actionable numbers.

Broadly speaking, better models would substantially aid in the “adaptive response” approach to re-opening the economy. In this strategy, policymakers cyclically loosen and re-tighten restrictions, attempting to work back towards a healthy economy without moving so fast as to allow infections to take off again. In an ideal process, restrictions would be eased at a pace that balances a swift return to normalcy with reducing total COVID-19 infections. Of course, this is impossible in practice, and thus some continued adjustments—the flipping of various controls off and on again—will be necessary. More precise models can help improve this process, providing another lens into when it will be safe to relax restrictions, thus making it easier to do without a disruptive back-and-forth. A more-or-less continuous easing of restrictions is especially valuable, since it is unlikely that second or third rounds of interventions (such as social distancing) would achieve the same high rates of compliance as the first round.

The proliferation of COVID-19 data

These models can incorporate cases, test-positive rates, hospitalization information, deaths, excess deaths, and other known COVID-19 data. While all these data sources are incomplete, an expanding body of research on COVID-19 is making the data more interpretable. This research will become progressively more valuable with more data on the spread of COVID-19 in the U.S. rather than data from other countries or past pandemics.

Further, a broad range of non-COVID-19 data can also inform risk estimates: Population density, age distributions, poverty and uninsured rates, the number of essential frontline workers, and co-morbidity factors can also be included. Community mobility reports from Google and Unacast’s social distancing scorecard can identify how the easing of restrictions is changing behavior. Small area estimates also allow the models to account for the risk of spread from other nearby geographies. Geospatial statistics cannot account for infectious spread between two large neighboring states, but they would add value for adjacent zip codes. Lastly, many more data sources are in the works, like open patient data registries, the National Institutes of Health’s (NIH) study of asymptomatic persons, self-reported symptoms data from Facebook, and (potentially) new randomized surveys. In fact, there are so many diverse and relevant data streams that models can add value simply by consolidating daily information into just a few top-line numbers that are comparable across the nation.

FiveThirtyEight has effectively explained that making these models is tremendously difficult due to incomplete data, especially since the U.S. is not testing enough or in statistically valuable ways. These challenges are real, but decision-makers are currently using this same highly flawed data to make inferences and policy choices. Despite the many known problems, elected officials and public health services have no choice. Frequently, they are evaluating the data without the time and expertise to make reasoned statistical interpretations based on epidemiological research, leaving significant opportunity for modeling to help….(More)”.
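To make the idea of consolidating many data streams into a few comparable top-line numbers concrete, here is a minimal sketch of a county-level relative risk score. All column names, example values, and the equal-weight averaging are illustrative assumptions, not the model Engler proposes:

```python
# Sketch of a county-level relative risk score: combine several standardized
# indicators into one number that is comparable across counties.
# Column names and values below are hypothetical placeholders.
import pandas as pd

counties = pd.DataFrame({
    "county": ["A", "B", "C"],
    "new_cases_per_100k_7d": [12.0, 45.0, 3.5],   # recent case rate
    "test_positive_rate": [0.04, 0.11, 0.02],     # share of tests positive
    "mobility_change": [-0.20, -0.05, -0.35],     # change vs. pre-pandemic baseline
    "pct_over_65": [0.18, 0.14, 0.22],            # demographic vulnerability
})

indicators = ["new_cases_per_100k_7d", "test_positive_rate",
              "mobility_change", "pct_over_65"]

# Standardize each indicator (z-score across counties), then take an
# equal-weight average as a crude relative risk score.
z = (counties[indicators] - counties[indicators].mean()) / counties[indicators].std()
counties["relative_risk"] = z.mean(axis=1)

print(counties[["county", "relative_risk"]]
      .sort_values("relative_risk", ascending=False))
```

Even a crude composite like this makes otherwise incommensurable data streams comparable across geographies; a real model would weight indicators using epidemiological evidence and account for spillover from adjacent areas, as the post notes.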

National Academies, National Science Foundation Create Network to Connect Decision-Makers with Social Scientists on Pressing COVID-19 Questions


Press Release: “The National Academies of Sciences, Engineering, and Medicine and the National Science Foundation announced today the formation of a Societal Experts Action Network (SEAN) to connect social and behavioral science researchers with decision-makers who are leading the response to COVID-19. SEAN will respond to the most pressing social, behavioral, and economic questions that are being asked by federal, state, and local officials by working with appropriate experts to quickly provide actionable answers.

The new network’s activities will be overseen by an executive committee in coordination with the National Academies’ Standing Committee on Emerging Infectious Diseases and 21st Century Health Threats, established earlier this year to provide rapid expert input on urgent questions facing the federal government on the COVID-19 pandemic. Standing committee members Robert Groves, executive vice president and provost at Georgetown University, and Mary T. Bassett, director of the François-Xavier Bagnoud Center for Health and Human Rights at Harvard University, will co-chair the executive committee to manage SEAN’s solicitation of questions and expert responses, anticipate leaders’ research needs, and guide the dissemination of network findings.

SEAN will include individual researchers from a broad range of disciplines as well as leading national social and behavioral science institutions. Responses to decision-maker requests may range from individual phone calls and presentations to written committee documents such as Rapid Expert Consultations.

“This pandemic has broadly impacted all aspects of life — not just our health, but our work, families, education, supply chains, and even the global environment,” said Marcia McNutt, president of the National Academy of Sciences. “Therefore, to address the myriad questions that are being raised by mayors, governors, local representatives, and other leaders, we must recruit the full range of scientific expertise from across the social, natural, and biomedical sciences.”   

“Our communities and our society at large are facing a range of complex issues on multiple fronts due to COVID-19,” said Arthur Lupia, head of the Directorate for Social, Behavioral, and Economic Sciences at the National Science Foundation. “These are human-centered issues affecting our daily lives — the education and well-being of our children, the strength of our economy, the health of our loved ones, neighbors, and so many more. Through SEAN, social and behavioral scientists will provide actionable, evidence-driven guidance to our leaders across the U.S. who are working to support our communities and speed their recovery.”…(More)”.

Public Service and Good Governance for the Twenty-First Century


Book edited by James L. Perry: “Two big ideas serve as the catalyst for the essays collected in this book. The first is the state of governance in the United States, which Americans variously perceive as broken, frustrating, and unresponsive. Editor James Perry observes in his Introduction that this perception is rooted in three simultaneous developments: government’s failure to perform basic tasks that once were taken for granted, an accelerating pace of change that quickly makes past standards of performance antiquated, and a dearth of intellectual capital that generates the capacity to bridge the gulf between expectations and performance. The second idea hearkens back to the Progressive era, when Americans revealed themselves to be committed to better administration of their government at all levels—federal, state, and local.

These two ideas—the diminishing capacity for effective governance and Americans’ expectations for reform—are veering in opposite directions. Contributors to Public Service and Good Governance for the Twenty-First Century explore these central ideas by addressing such questions as: what is the state of government today? Can future disruptions of governance and public service be anticipated? What forms of government will emerge from the past and what institutions and structures will be needed to meet future challenges? And lastly, and perhaps most importantly, what knowledge, skills, and abilities will need to be fostered for tomorrow’s civil servants to lead and execute effectively?

Public Service and Good Governance for the Twenty-First Century offers recommendations for bending the trajectories of governance capacity and reform expectations toward convergence, including reversing the trend of administrative disinvestment, developing talent for public leadership through higher education, creating a federal civil service to meet future needs, and rebuilding bipartisanship so that the sweeping changes needed to restore good government become possible….(More)”

Data Sharing in the Context of Health-Related Citizen Science


Paper by Mary A. Majumder and Amy L. McGuire: “As citizen science expands, questions arise regarding the applicability of norms and policies created in the context of conventional science. This article focuses on data sharing in the conduct of health-related citizen science, asking whether citizen scientists have obligations to share data and publish findings on par with the obligations of professional scientists. We conclude that there are good reasons for supporting citizen scientists in sharing data and publishing findings, and we applaud recent efforts to facilitate data sharing. At the same time, we believe it is problematic to treat data sharing and publication as ethical requirements for citizen scientists, especially where there is the potential for burden and harm without compensating benefit…(More)”.