Lawless Surveillance


Paper by Barry Friedman: “Here in the United States, policing agencies are engaging in mass collection of personal data, building a vast architecture of surveillance. License plate readers collect our location information. Mobile forensics data terminals suck in the contents of cell phones during traffic stops. CCTV maps our movements. Cheap storage means most of this is kept for long periods of time—sometimes into perpetuity. Artificial intelligence makes searching and mining the data a snap. For most of us whose data is collected, stored, and mined, there is no suspicion whatsoever of wrongdoing.

This growing network of surveillance is almost entirely unregulated. It is, in short, lawless. The Fourth Amendment touches almost none of it, either because what is captured occurs in public, and so is supposedly “knowingly exposed,” or because of doctrine that shields information collected from third parties. It is unregulated by statutes because legislative bodies—when they even know about these surveillance systems—see little profit in taking on the police.

In the face of growing concern over such surveillance, this Article argues there is a constitutional solution sitting in plain view. In virtually every other instance in which personal information is collected by the government, courts require that a sound regulatory scheme be in place before information collection occurs. The rulings on the mandatory nature of regulation are remarkably similar, no matter under which clause of the Constitution collection is challenged.

This Article excavates this enormous body of precedent and applies it to the problem of government mass data collection. It argues that before the government can engage in such surveillance, there must be a regulatory scheme in place. And by changing the default rule from allowing police to collect absent legislative prohibition, to banning collection until there is legislative action, legislatures will be compelled to act (or there will be no surveillance). The Article defines what a minimally acceptable regulatory scheme for mass data collection must include, and shows how it can be grounded in the Constitution…(More)”.

All Democracy Is Global


Article by Larry Diamond: “The world is mired in a deep, diffuse, and protracted democratic recession. According to Freedom House, 2021 was the 16th consecutive year in which more countries declined in freedom than gained. Tunisia, the sole democracy to emerge from the Arab Spring protests that began in 2010, is morphing into a dictatorship. In countries as diverse as Bangladesh, Hungary, and Turkey, elections have long ceased to be democratic. Autocrats in Algeria, Belarus, Ethiopia, Sudan, Turkey, and Zimbabwe have clung to power despite mounting public demands for democratization. In Africa, seven democracies have slid back into autocracy since 2015, including Benin and Burkina Faso.

Democracy is looking shaky even in countries that hold free and fair elections. In emerging-market behemoths such as Brazil, India, and Mexico, democratic institutions and norms are under attack. Brazilian President Jair Bolsonaro has made threats of an autogolpe (self-coup) and a possible return to military rule if he does not win reelection in October. Indian Prime Minister Narendra Modi has steadily chipped away at press freedoms, minority rights, judicial independence, the integrity of the civil service, and the autonomy of civil society. Mexican President Andrés Manuel López Obrador has attempted to silence critics and remove democratic checks and balances.

Democratic prospects have risen and fallen in decades past, but they now confront a formidable new problem: democracy is at risk in the very country that has traditionally been its most ardent champion. Over the past dozen years, the United States has experienced one of the biggest declines in political rights and civil liberties of any country measured by the Freedom House annual survey. The Economist now ranks the United States as a “flawed democracy” behind Spain, Costa Rica, and Chile. U.S. President Donald Trump deserves much of the blame: he abused presidential power on a scale unprecedented in U.S. history and, after being voted out of office, propagated the “Big Lie” of election fraud and incited the violent rioters who stormed the U.S. Capitol on January 6, 2021. But American democracy was in peril before Trump assumed office, with rising polarization exposing acute flaws in American democratic institutions. The Electoral College, the representational structure of the Senate, the Senate filibuster, the brazen gerrymandering of House districts, and lifetime appointments to the Supreme Court have all made it possible for a political minority to exert prolonged outsize influence.

Can a country in the throes of its own democratic decay do anything to arrest the broader global decline? For many, the answer is no…(More)”.

The case for lotteries as a tiebreaker of quality in research funding


Editorial at Nature: “Earlier this month, the British Academy, the United Kingdom’s national academy for humanities and social sciences, introduced an innovative process for awarding small research grants. The academy will use the equivalent of a lottery to decide between funding applications that its grant-review panels consider to be equal on other criteria, such as the quality of research methodology and study design.

Using randomization to decide between grant applications is relatively new, and the British Academy joins a small group of funders trialling it, led by the Volkswagen Foundation in Germany, the Austrian Science Fund and the Health Research Council of New Zealand. The Swiss National Science Foundation (SNSF) has arguably gone the furthest: it decided in late 2021 to use randomization in all tiebreaker cases across its entire grant portfolio of around 880 million Swiss francs (US$910 million).

Other funders should consider following in these footsteps. That’s because it is becoming clear that randomization is a fairer way to allocate grants when applications are too close to call, as a study from the Research on Research Institute in London shows (see go.nature.com/3s54tgw). Doing so would go some way to assuage concerns, especially among early-career researchers and those from historically marginalized communities, about the lack of fairness when grants are allocated using peer review.

The British Academy/Leverhulme small-grants scheme distributes around £1.5 million (US$1.7 million) each year in grants of up to £10,000 each. These are valuable despite their relatively small size, especially for researchers starting out. The academy’s grants can be used only for direct research expenses, but small grants are also typically used to fund conference travel or to purchase computer equipment or software. Funders also use them to spot promising research talent for future (or larger) schemes. For these reasons and more, small grants are competitive — the British Academy says it is able to fund only 20–30% of applications in each funding round…(More)”.
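To make the tiebreaker the editorial describes concrete, here is a minimal sketch in Python. The scores, tier sizes and award count are invented for illustration; this is not the British Academy’s or the SNSF’s actual procedure. Applications are funded in descending order of review score, and when a tied tier straddles the funding cutoff, the remaining awards are drawn at random within that tier.

```python
import random

def allocate_grants(applications, num_awards, seed=None):
    """Fund the top applications by review score; where a tied tier
    straddles the cutoff, fill the remaining slots by random draw."""
    rng = random.Random(seed)  # a seedable generator keeps each draw auditable

    # Group applications into tiers of equal review score.
    tiers = {}
    for app_id, score in applications:
        tiers.setdefault(score, []).append(app_id)

    funded = []
    for score in sorted(tiers, reverse=True):  # strongest tier first
        slots_left = num_awards - len(funded)
        if slots_left <= 0:
            break
        tier = tiers[score]
        if len(tier) <= slots_left:
            funded.extend(tier)                          # whole tier fits: fund it all
        else:
            funded.extend(rng.sample(tier, slots_left))  # lottery at the cutoff
    return funded

# Three awards, with a three-way tie straddling the cutoff:
apps = [("A", 9.1), ("B", 8.7), ("C", 8.7), ("D", 8.7), ("E", 8.2)]
print(allocate_grants(apps, num_awards=3, seed=2022))  # "A" plus two of B/C/D
```

On this design, quality still does all the ranking; chance enters only where peer review itself reports no difference.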

Learning to Share: Lessons on Data-Sharing from Beyond Social Media


Paper by CDT: “What role has social media played in society? Did it influence the rise of Trumpism in the U.S. and the passage of Brexit in the UK? What about the way authoritarians exercise power in India or China? Has social media undermined teenage mental health? What about its role in building social and community capital, promoting economic development, and so on?

To answer these and other important policy-related questions, researchers such as academics, journalists, and others need access to data from social media companies. However, this data is generally not available to researchers outside of social media companies and, where it is available, it is often insufficient, meaning that we are left with incomplete answers.

Governments on both sides of the Atlantic have passed or proposed legislation to address the problem by requiring social media companies to provide certain data to vetted researchers (Vogus, 2022a). Researchers themselves have thought a lot about the problem, including the specific types of data that can further public interest research, how researchers should be vetted, and the mechanisms companies can use to provide data (Vogus, 2022b).

For their part, social media companies have sanctioned some methods of sharing data with certain types of researchers through APIs (e.g., researchers with university affiliations) and with certain limitations (such as limits on how much and what types of data are available). In general, these efforts have been insufficient. In part, this is due to legitimate concerns such as the need to protect user privacy or to avoid revealing company trade secrets. But in some cases the lack of sharing is due to other factors, such as a lack of resources or knowledge about how to share data effectively, or resistance to independent scrutiny.

The problem is complex but not intractable. In this report, we look to other industries where companies share data with researchers through different mechanisms while also addressing concerns around privacy. In doing so, our analysis contributes to current public and corporate discussions about how to safely and effectively share social media data with researchers. We review experiences based on the governance of clinical trials, electricity smart meters, and environmental impact data…(More)”

What competencies do public sector officials need to enhance national digital transformations?


Report by the Broadband Commission for Sustainable Development: “The Broadband Commission Working Group on AI Capacity Building has leveraged a multi-stakeholder leadership model to assess the critical capacity needs for public sector digital transformation, including from a developing country perspective. From interviews with policymakers, global and regional expert consultations and evaluation of current international practices, the Working Group has developed three competency domains and nine recommendations. The output is a competency framework for civil servants, spelling out the Artificial Intelligence and Digital Transformation Competencies needed today…(More)”

New WHO policy requires sharing of all research data


Press release: “Science and public health can benefit tremendously from sharing and reuse of health data. Sharing data allows us to have the fullest possible understanding of health challenges, to develop new solutions, and to make decisions using the best available evidence.

The Research for Health department has helped spearhead the launch of a new policy from the Science Division which covers all research undertaken by or with support from WHO. The goal is to make sure that all research data is shared equitably, ethically and efficiently. Through this policy, WHO indicates its commitment to transparency in order to reach the goal of one billion more people enjoying better health and well-being.

The WHO policy is accompanied by practical guidance to enable researchers to develop and implement a data management and sharing plan before the research has even started. The guide provides advice on the technical, ethical and legal considerations to ensure that data, even patient data, can be shared for secondary analysis without compromising personal privacy. Data sharing is now a requirement for research funding awarded by WHO and TDR.

“We have seen the problems caused by the lack of data sharing on COVID-19,” said Dr. Soumya Swaminathan, WHO Chief Scientist. “When data related to research activities are shared ethically, equitably and efficiently, there are major gains for science and public health.”

The policy to share data from all research funded or conducted by WHO, and practical guidance to do so, can be found here…(More)”.

Using real-time indicators for economic decision-making in government: Lessons from the Covid-19 crisis in the UK


Paper by David Rosenfeld: “When the UK went into lockdown in mid-March 2020, government was faced with the dual challenge of managing the impact of closing down large parts of the economy and responding effectively to the pandemic. Policy-makers needed to make rapid decisions regarding, on the one hand, the extent of restrictions on movement and economic activity to limit the spread of the virus, and on the other, the amount of support that would be provided to individuals and businesses affected by the crisis. Traditional official statistics, such as gross domestic product (GDP) or unemployment, which are released monthly and with a lag, could not be relied upon to monitor the situation and guide policy decisions.

In response, teams of data scientists and statisticians pivoted to develop alternative indicators, leading to an unprecedented amount of innovation in how statistics and data were used in government. This ranged from monitoring sewage water for signs of Covid-19 infection to the Office for National Statistics (ONS) developing a new range of ‘faster indicators’ of economic activity using online job vacancies and data on debit and credit card expenditure from the Clearing House Automated Payment System (CHAPS).

The ONS received generally positive reviews for its performance during the crisis (The Economist, 2022), in contrast to the 2008 financial crisis when policy-makers did not realise the extent of the recession until subsequent revisions to GDP estimates were made. Partly in response to this, the Independent Review of UK Economic Statistics (HM Treasury, 2016) recommended improvements to the use of administrative data and alternative indicators as well as to data science capability to exploit both the extra granularity and the timeliness of new data sources.

This paper reviews the elements that contributed to successes in using real-time data during the pandemic as well as the challenges faced during this period, with a view to distilling some lessons for future use in government. Section 2 provides an overview of real-time indicators (RTIs) and how they were used in the UK during the Covid-19 crisis. The next sections analyse the factors that underpinned the successes (or lack thereof) in using such indicators: section 3 addresses skills, section 4 infrastructure, and section 5 legal frameworks and processes. Section 6 concludes with a summary of the main lessons for governments that hope to make greater use of RTIs…(More)”.
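As a flavour of the ‘faster indicators’ mentioned above, the sketch below (Python with pandas; the file name, column names and baseline window are assumptions for illustration, not the ONS’s actual pipeline) turns daily card-spend totals into a smoothed index against a pre-crisis baseline.

```python
import pandas as pd

def spending_index(daily: pd.Series, baseline_start: str, baseline_end: str) -> pd.Series:
    """Convert daily spend totals into a 7-day-smoothed index,
    where 100 = average daily spend over the baseline window."""
    baseline = daily.loc[baseline_start:baseline_end].mean()
    smoothed = daily.rolling(window=7, min_periods=7).mean()  # damp day-of-week effects
    return 100 * smoothed / baseline

# Hypothetical input: one row per day, one total-spend column.
daily = (
    pd.read_csv("card_spend.csv", parse_dates=["date"])
      .set_index("date")["total_spend"]
      .sort_index()
)
index = spending_index(daily, "2020-01-01", "2020-02-29")
print(index.tail())  # e.g. a reading of 62 would mean spend roughly 38% below baseline
```

Because an index like this updates daily rather than monthly, it can flag a collapse or recovery in activity weeks before official GDP estimates catch up, which is exactly the gap real-time indicators filled during lockdown.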

‘Very Harmful’ Lack of Data Blunts U.S. Response to Outbreaks


Article by Sharon LaFraniere: “After a middle-aged woman tested positive for Covid-19 in January at her workplace in Fairbanks, public health workers sought answers to questions vital to understanding how the virus was spreading in Alaska’s rugged interior.

The woman, they learned, had underlying conditions and had not been vaccinated. She had been hospitalized but had recovered. Alaska and many other states have routinely collected that kind of information about people who test positive for the virus. Part of the goal is to paint a detailed picture of how one of the worst scourges in American history evolves and continues to kill hundreds of people daily, despite determined efforts to stop it.

But most of the information about the Fairbanks woman — and tens of millions more infected Americans — remains effectively lost to state and federal epidemiologists. Decades of underinvestment in public health information systems have crippled efforts to understand the pandemic, stranding crucial data in incompatible data systems so outmoded that information often must be repeatedly typed in by hand. The data failure, a salient lesson of a pandemic that has killed more than one million Americans, will be expensive and time-consuming to fix.

The precise cost in needless illness and death cannot be quantified. The nation’s comparatively low vaccination rate is clearly a major factor in why the United States has recorded the highest Covid death rate among large, wealthy nations. But federal experts are certain that the lack of comprehensive, timely data has also exacted a heavy toll.

“It has been very harmful to our response,” said Dr. Ashish K. Jha, who leads the White House effort to control the pandemic. “It’s made it much harder to respond quickly.”

Details of the Fairbanks woman’s case were scattered among multiple state databases, none of which connect easily to the others, much less to the Centers for Disease Control and Prevention, the federal agency in charge of tracking the virus. Nine months after she fell ill, her information was largely useless to epidemiologists because it was impossible to synthesize most of it with data on the roughly 300,000 other Alaskans and the 95 million-plus other Americans who have gotten Covid…(More)”.
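The synthesis failure described above has a concrete shape: the same facts sit under different field names and encodings in each system, so nothing can be counted across systems until every extract is mapped onto a shared schema. A minimal sketch in Python with pandas, with both record layouts invented for illustration:

```python
import pandas as pd

# Hypothetical extracts from two state systems that describe the same
# person under different field names and encodings.
lab_results = pd.DataFrame(
    {"PatientID": ["AK-1001"], "TestDate": ["2022-01-14"], "Result": ["POS"]}
)
immunization = pd.DataFrame(
    {"person_id": ["AK-1001"], "vaccinated": ["N"]}
)

# Map each extract onto a shared schema before asking a cross-system
# question such as "how many positives were unvaccinated?".
cases = (
    lab_results
    .rename(columns={"PatientID": "person_id", "TestDate": "test_date"})
    .assign(positive=lambda df: df["Result"].eq("POS"))
    .drop(columns="Result")
)
linked = cases.merge(
    immunization.assign(vaccinated=lambda df: df["vaccinated"].eq("Y")),
    on="person_id",
    how="left",
)
print(linked)  # one harmonized row per case, ready for aggregation
```

Multiply the renames and recodings by dozens of incompatible systems, with records re-typed by hand along the way, and the cost of the missing shared schema becomes clear.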

Towards an international data governance framework


Paper by Steve MacFeely et al: “The CCSA (Committee for the Coordination of Statistical Activities) argued that a Global Data Compact (GDC) could provide a framework to ensure that data are safeguarded as a global public good and as a resource to achieve equitable and sustainable development. This compact, by promoting common objectives, would help avoid fragmentation where each country or region adopts its own approach to data collection, storage, and use. A coordinated approach would give individuals and enterprises confidence that data relevant to them carry protections and obligations no matter where they are collected or used…

The universal principles and standards should set out the elements of responsible and ethical handling and sharing of data and data products. The compact should also move beyond simply establishing ethical principles and create a global architecture that includes standards and incentives for compliance. Such an architecture could be the foundation for rethinking the data economy, promoting open data, encouraging data exchange, fostering innovation and facilitating international trade. It should build upon the existing canon of international human rights and other conventions, laws and treaties that set out useful principles and compliance mechanisms.

Such a compact will require a new type of global architecture. Modern data ecosystems are not controlled by states alone, so any Compact, Geneva Convention, Commons, or Bretton Woods type agreement will require a multitude of stakeholders and signatories – states, civil society, and the private sector at the very least. This would be very different to any international agreement that currently exists. Therefore, to support a GDC, a new global institution or platform may be needed to bring together the many data communities and ecosystems that comprise not only national governments, private sector and civil society but also participants in specific fields, such as artificial intelligence, digital and IT services. Participants would maintain and update data standards, oversee accountability frameworks, and support mechanisms to facilitate the exchange and responsible use of data. The Global Digital Compact proposed as part of Our Common Agenda will also need to address the challenges of bringing many different constituencies together and may point the way…(More)”

Innovation in the Public Sector: Smarter States, Services and Citizens


Book by Fatih Demir: “The book discusses smart governments and innovation in the public sector. In hopes of arriving at a clear definition of innovation in the field of public administration, the volume provides a wide survey of global policies and practices, especially those aimed at reducing bureaucracy and using information and communication technologies in public service delivery. Chapters look at current applications across countries and multiple levels of government, from public innovation labs in the UK to AI in South Korea. Providing concrete examples of innovation culture at work in public institutions, this volume will be of use to researchers and students studying new public management, public service delivery, and innovation, as well as practitioners and professionals working in various public agencies…(More)”.