Luxury Surveillance


Essay by Chris Gilliard and David Golumbia: One of the most troubling features of the digital revolution is that some people pay to subject themselves to surveillance that others are forced to endure and would, if anything, pay to be free of.

Consider a GPS tracker you can wear around one of your arms or legs. Make it sleek and cool — think the Apple Watch or Fitbit — and some will pay hundreds or even thousands of dollars for the privilege of wearing it. Make it bulky and obtrusive, and others, as a condition of release from jail or prison, being on probation, or awaiting an immigration hearing, will be forced to wear one — and forced to pay for it too.

In each case, the device collects intimate and detailed biometric information about its wearer and uploads that data to servers, communities, and repositories. To the providers of the devices, this data and the subsequent processing of it are the main reasons the devices exist. They are means of extraction: That data enables further study, prediction, and control of human beings and populations. While some providers certainly profit from the sale of devices, this secondary market for behavioral control and prediction is where the real money is — the heart of what Shoshana Zuboff rightly calls surveillance capitalism.

The formerly incarcerated person knows that their ankle monitor exists for that purpose: to predict and control their behavior. But the Apple Watch wearer likely thinks about it little, if at all — despite the fact that the watch has the potential to collect and analyze much more data about its user (e.g., health metrics like blood pressure, blood glucose levels, ECG data) than parole or probation officers are even allowed to gather about their “clients” without a specific warrant. Fitness-tracker wearers are effectively putting themselves on parole and paying for the privilege.

Both the Apple Watch and the Fitbit can be understood as examples of luxury surveillance: surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits they are likely to celebrate. Google, which has recently acquired Fitbit, is seemingly leaning into the category, launching a more expensive version of the device named the “Luxe.” Only certain people can afford luxury surveillance, but that is not necessarily a matter of money: In general terms, consumers of luxury surveillance see themselves as powerful and sovereign, and perhaps even immune from unwelcome monitoring and control. They see self-quantification and tracking not as disciplinary or coercive, but as a kind of care or empowerment. They understand it as something extra, something “smart.”…(More)”.

New Orleans is using sentiment analysis on federal relief funding


Ryan Johnston at StateScoop: “New Orleans is using data and social-media analysis to gauge how residents want the city to spend $375 million in federal stimulus funding, while quelling concerns of corruption or misuse that still exist from the city’s Hurricane Katrina recovery, officials told StateScoop on Tuesday.

The city government is working with ZenCity, an Israeli data-analysis firm that trawls social media to better understand how residents feel about various issues, to research American Rescue Plan funding. New Orleans is set to receive $375 million in relief funding to stabilize its finances and “directly address” the economic impact that the COVID-19 pandemic had on the city, said Liana Elliot, the city’s deputy chief of staff. But many residents of the city are still wary of how the city squandered its Federal Emergency Management Agency funding following the natural disaster in 2005.

That caution became apparent almost immediately in online discourse, said Eyal Feder-Levy, ZenCity’s chief executive.

“We saw within the data that conversations about city budgets online in New Orleans were five times more frequent than normal following the ARPA stimulus funding announcement,” Feder-Levy told StateScoop.

Elliot said what she heard about the budget in public didn’t match the conversations she was having with her colleagues in city government. Residents, she said, had an expectation that the money would help them, rather than go to city agencies…(More)”.
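
ZenCity’s methods are proprietary, so the following is only a hedged sketch of the kind of comparison quoted above — counting how often budget-related terms appear in a batch of social posts and expressing that as a multiple of a historical baseline. The term list and sample posts are invented for illustration.

```python
# Hypothetical sketch: count topic mentions and compare against a baseline.
BUDGET_TERMS = {"budget", "arpa", "stimulus", "relief funds"}

def mentions(posts):
    """Count posts mentioning at least one budget-related term."""
    return sum(
        1 for p in posts
        if any(term in p.lower() for term in BUDGET_TERMS)
    )

def frequency_ratio(current_posts, baseline_posts):
    """How many times more frequent budget talk is now vs. the baseline."""
    base = mentions(baseline_posts) or 1  # avoid division by zero
    return mentions(current_posts) / base

baseline = ["Great po-boys downtown", "Budget hearing tonight"]
current = [
    "How will the ARPA stimulus be spent?",
    "City budget transparency, please",
    "Don't repeat the Katrina relief funds mess",
    "Saints game this weekend",
]
print(frequency_ratio(current, baseline))  # 3 mentions vs. 1 -> 3.0
```

A production system would of course normalize for posting volume and use far more robust topic classification than keyword matching; the point here is only the before/after frequency comparison.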

Policy Impacts


About: “Over the past 50 years, researchers have made great strides in analyzing public policy. With better data and improved research methods, we know more than ever about the impacts of government spending.

But despite these advances, it remains surprisingly challenging to answer basic questions about which policies have been most effective.

The difficulty arises because methods for evaluating policy effectiveness are not standardized. This makes it challenging, if not impossible, to compare and contrast across different policies.

Policy Impacts seeks to promote a unified approach to policy evaluation, built around the Marginal Value of Public Funds, a standardized metric for comparing policies. We have created the Policy Impacts library, a collaborative effort to track the returns to a wide range of government policies…(More)”.
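
The Marginal Value of Public Funds is conventionally defined as beneficiaries’ willingness to pay for a policy divided by its net cost to government (gross spending minus any revenue the policy recoups). The sketch below illustrates that arithmetic; the program and all numbers are made up, so see the Policy Impacts library itself for real estimates.

```python
# Illustrative sketch of the Marginal Value of Public Funds (MVPF).
def mvpf(willingness_to_pay, gross_cost, fiscal_externality):
    """MVPF = beneficiaries' willingness to pay per dollar of net
    government cost (gross spending minus revenue the policy recoups,
    e.g. via higher future tax payments)."""
    net_cost = gross_cost - fiscal_externality
    if net_cost <= 0:
        return float("inf")  # the policy pays for itself
    return willingness_to_pay / net_cost

# A hypothetical job-training program: recipients value it at $120
# per $100 spent, and higher earnings return $30 in taxes.
print(mvpf(willingness_to_pay=120, gross_cost=100, fiscal_externality=30))
# -> 120 / 70, roughly 1.71
```

The appeal of the metric is exactly this comparability: because every policy reduces to one dollars-of-benefit-per-net-dollar-spent number, a job-training program can be ranked against, say, a tax credit.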

Closing the Data Gap: How Cities Are Delivering Better Results for Residents


Report by The Monitor Institute by Deloitte: “Better services. Smarter and more efficient use of tax dollars. Greater transparency and civic engagement. These are the results from the data-driven transformation in city halls across the country. The movement that began in just a handful of cities six years ago has now spread far and wide. Hundreds of cities, both large and small and in every region of the country, have embraced a new approach to local governance. Moving beyond old practices based on precedent or instinct, city leaders and staff are instead using data to make more effective operational, programmatic, and policy decisions. And residents are reaping real benefits, from improved services to greater visibility into how their local government works…

  • Performance management: The percentage of cities monitoring and analyzing their progress toward key goals has more than doubled (from 30% to 75%)
  • Public engagement: The percentage of cities engaging with residents on a goal and communicating progress has more than tripled (from 19% to 70%)
  • Releasing data: The percentage of cities with a platform and process to release data to residents has more than tripled (from 18% to 67%)
  • Taking action: The percentage of cities modifying existing programs based on data analytics has more than doubled (from 28% to 61%).

The results: greater transparency around how and why decisions are made, more effective and efficient operations, and improved services. For example, 60% of city officials surveyed in the What Works Cities (WWC) network reported improved emergency response times, and 70% reported that their cities are systematically using data-informed decision-making to respond to the COVID-19 crisis. More than half of survey respondents also reported improving their use of data to make budget decisions, award city contracts and/or shift procurement dollars, and deliver city services more efficiently, effectively, and/or equitably.

This kind of progress builds residents’ trust in government, produces better outcomes, and reflects the broad culture shift underway in city governments across the country — demonstrating that an evidence-informed approach is possible for all U.S. cities. Today, more than 250 municipal governments across the country are changing how they do business and tackling local challenges by putting into place critical data infrastructure and/or improving data skills….(More)”.

The Diffusion of Disruptive Technologies


Paper by Nicholas Bloom, Tarek Alexander Hassan, Aakash Kalyani, Josh Lerner & Ahmed Tahoun: “We identify novel technologies using textual analysis of patents, job postings, and earnings calls. Our approach enables us to identify and document the diffusion of 29 disruptive technologies across firms and labor markets in the U.S. Five stylized facts emerge from our data. First, the locations where technologies are developed that later disrupt businesses are geographically highly concentrated, even more so than overall patenting. Second, as the technologies mature and the number of new jobs related to them grows, they gradually spread across space. While initial hiring is concentrated in high-skilled jobs, over time the mean skill level in new positions associated with the technologies declines, broadening the types of jobs that adopt a given technology. At the same time, the geographic diffusion of low-skilled positions is significantly faster than higher-skilled ones, so that the locations where initial discoveries were made retain their leading positions among high-paying positions for decades. Finally, these technology hubs are more likely to arise in areas with universities and high skilled labor pools….(More)”
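
The paper’s measurement strategy can be gestured at with a toy example: flag documents (here, job postings) that mention a technology phrase, then track how the average skill requirement of matching postings changes over time. The postings, skill scores, and term below are all invented; the actual paper builds its technology lexicon from patents and earnings calls.

```python
# Hedged sketch of text-based technology tracking (all data made up).
postings = [
    {"year": 2012, "text": "Machine learning researcher, PhD required", "skill": 5},
    {"year": 2012, "text": "Forklift operator", "skill": 1},
    {"year": 2018, "text": "Machine learning engineer, BS preferred", "skill": 3},
    {"year": 2018, "text": "Machine learning data labeler", "skill": 2},
]

def mean_skill(term, year):
    """Average skill score of postings in a given year mentioning the term."""
    matched = [p["skill"] for p in postings
               if term in p["text"].lower() and p["year"] == year]
    return sum(matched) / len(matched) if matched else None

print(mean_skill("machine learning", 2012))  # 5.0
print(mean_skill("machine learning", 2018))  # 2.5 -- falls as the technology diffuses
```

The declining mean mirrors the paper’s second stylized fact: early hiring around a new technology is concentrated in high-skilled roles, and the skill mix broadens as adoption spreads.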

Transparency’s AI Problem


Paper by Hannah Bloch-Wehba: “A consensus seems to be emerging that algorithmic governance is too opaque and ought to be made more accountable and transparent. But algorithmic governance underscores the limited capacity of transparency law—the Freedom of Information Act and its state equivalents—to promote accountability. Drawing on the critical literature on “open government,” this Essay shows that algorithmic governance reflects and amplifies systemic weaknesses in the transparency regime, including privatization, secrecy, private sector cooptation, and reactive disclosure. These deficiencies highlight the urgent need to reorient transparency and accountability law toward meaningful public engagement in ongoing oversight. This shift requires rethinking FOIA’s core commitment to public disclosure of agency records, exploring instead alternative ways to empower the public and to shed light on decisionmaking. The Essay argues that new approaches to transparency and accountability for algorithmic governance should be independent of private vendors, and ought to adequately represent the interests of affected individuals and communities. These considerations, of vital importance for the oversight of automated systems, also hold broader lessons for efforts to recraft open government obligations in the public interest….(More)”

Sovereignty and Data Localization


Paper by Emily Wu: “Data localization policies impose obligations on businesses to store and process data locally, rather than in servers located overseas. The adoption of data localization laws has been increasing, driven by the fear that a nation’s sovereignty will be threatened by its inability to exert full control over data stored outside its borders. This is particularly relevant to the US given its dominance in many areas of the digital ecosystem including artificial intelligence and cloud computing.

Unfortunately, data localization policies are causing more harm than good. They are ineffective at improving security, do little to simplify the regulatory landscape, and are causing economic harms to the markets where they are imposed. In order to move away from these policies, the fear of sovereignty dilution must be addressed by alternative means. This will be achieved most effectively by focusing on both technical concerns and value concerns.

To address technical concerns, the US should:

1. Enact a federal national privacy law to reduce the fears that foreign nations have about the power of US tech companies.

2. Mandate privacy and security frameworks by industry to demonstrate the importance that US industry places on privacy and security, recognizing it as fundamental to their business success.

3. Increase investment in cybersecurity to ensure that, in a competitive market, the US has the best offering in both customer experience and security assurance.

4. Expand multi-lateral agreements under the CLOUD Act to help alleviate the concerns that data stored by US companies will be inaccessible to foreign governments when relevant to a criminal investigation…(More)”

Federal Statistical Needs for a National Advanced Industry and Technology Strategy


Position paper by Robert D. Atkinson: “With the rise of China and other robust economic competitors, the United States needs a more coherent national advanced technology strategy. Effectively crafting and implementing such a strategy requires the right kind of economic data. In part because of years of budget cuts to federal economic data agencies, coupled with a long-standing disregard of the need for sectoral and firm-level economic data to inform an industrial strategy, the federal government is severely lacking in the kinds of data needed.

Notwithstanding the hundreds of millions of dollars spent every year and the thousands of economists working for the federal government, the exact nature of the challenges to U.S. capabilities with regard to the competitiveness of America’s traded sectors is only weakly understood. At least since the aftermath of the Great Depression, the federal government has never felt the need to develop strategic economic intelligence in order to fully understand the competitive position of its traded sectors or to help support overall economic productivity. Rather, most of the focus goes to understanding the ups and downs of the business cycle….

If the U.S. government is going to develop more effective policies to spur competitiveness, growth, and opportunity, it will need to support better data collection. It should be able to understand the U.S. competitive position vis-à-vis other nations on key technologies and industries, as well as key strengths and weaknesses and where specific policies are needed.

Better data can also identify weaknesses in U.S. competitiveness that policy can address. For example, in the 1980s, studies conducted as part of the Census of Manufactures (studies that have long been discontinued) found many smaller firms lagging behind badly in costs and quality for reasons including inefficient work organization and obsolete machinery and equipment. End-product manufacturers bought parts and components from many of these smaller enterprises at prices higher than those paid by foreign-based firms with more efficient suppliers, contributing to the cost and quality disadvantages of U.S.-based manufacturers. Legislators heeded the findings in crafting what is now called the Manufacturing Extension Partnership, a program that, if too small in scale to have a significant impact on U.S. manufacturing overall, continues to provide meaningful assistance to thousands of companies each year.

Moreover, as the federal government institutes more technology and industry policies and programs—as exemplified in the Senate U.S. Innovation and Competition Act—better data will be important to evaluate their effectiveness.

Finally, data are a key 21st century infrastructure. In a decentralized economy, good outcomes are possible only if organizations make good decisions—and that requires data, which, because of its public goods nature, is a quintessential role of government….(More)”.

Why We Should End the Data Economy


Essay by Carissa Véliz: “…The data economy undermines equality and fairness. You and your neighbor are no longer treated as equal citizens. You aren’t given an equal opportunity because you are treated differently on the basis of your data. The ads and content you have access to, the prices you pay for the same services, and even how long you wait when you call customer service depend on your data.

We are much better at collecting personal data than we are at keeping it safe. But stockpiled personal data is a serious security risk, and we shouldn’t be collecting it in the first place if we are incapable of keeping it safe. Using smartphone location data acquired from a data broker, reporters from The New York Times were able to track military officials with security clearances, powerful lawyers and their guests, and even the president of the United States (through the phone of someone believed to be a Secret Service agent).

Our current data economy is based on collecting as much personal data as possible, storing it indefinitely, and selling it to the highest bidder. Having so much sensitive data circulating freely is reckless. By designing our economy around surveillance, we are building a dangerous structure for social control that is at odds with freedom. In the surveillance society we are constructing, there is no such thing as under the radar. It shouldn’t be up to us to constantly opt out of data collection. The default matters, and the default should be no data collection…(More)”.

Virtual Juries


Paper by Valerie P. Hans: “The introduction of virtual or remote jury trials in response to the COVID-19 pandemic constitutes a remarkable natural experiment with one of our nation’s central democratic institutions. Although it is not a tightly controlled experimental study, real world experiences in this natural experiment offer some insights about how key features of trial by jury are affected by a virtual procedure. This article surveys the landscape of virtual jury trials. It examines the issues of jury representativeness, the adequacy of virtual jury selection, the quality of decision making, and the public’s access to jury trial proceedings. Many have expressed concern that the digital divide would negatively affect jury representativeness. Surprisingly, there is some preliminary evidence that suggests that virtual jury selection procedures lead to jury venires that are as diverse, if not more diverse, than pre-pandemic jury venires. Lawyers in a demonstration project reacted favorably to virtual voir dire when it was accompanied by expansive pretrial juror questionnaires and the opportunity to question prospective jurors. A number of courts provided public access by live streaming jury trials. How a virtual jury trial affects jurors’ interpretations of witness testimony, attorney arguments, and jury deliberation remains an open question….(More)”