New Orleans is using sentiment analysis on federal relief funding


Ryan Johnston at StateScoop: “New Orleans is using data and social-media analysis to gauge how residents want the city to spend $375 million in federal stimulus funding, while quelling concerns of corruption or misuse that still exist from the city’s Hurricane Katrina recovery, officials told StateScoop on Tuesday.

The city government is working with ZenCity, an Israeli data-analysis firm that trawls social media to better understand how residents feel about various issues, to research American Rescue Plan funding. New Orleans is set to receive $375 million in relief funding to stabilize its finances and “directly address” the economic impact that the COVID-19 pandemic had on the city, said Liana Elliot, the city’s deputy chief of staff. But many residents are still wary of how the city squandered its Federal Emergency Management Agency funding following the natural disaster in 2005.

That caution became apparent almost immediately in online discourse, said Eyal Feder-Levy, ZenCity’s chief executive.

“We saw within the data that conversations about city budgets online in New Orleans were five times more frequent than normal following the ARPA stimulus funding announcement,” Feder-Levy told StateScoop.

Elliot said what she heard about the budget in public didn’t match the conversations she was having with her colleagues in city government. Residents, she said, had an expectation that the money would help them, rather than go to city agencies…(More)”.
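ZenCity’s methods are proprietary, but the “five times more frequent than normal” finding can be illustrated with a simple baseline-versus-recent frequency ratio. A minimal sketch, with invented post data and a hypothetical keyword filter standing in for real topic classification:

```python
from datetime import date

# Hypothetical sample: (post_date, text) pairs standing in for scraped posts.
posts = [
    (date(2021, 3, 1), "potholes on canal street again"),
    (date(2021, 3, 2), "who decides the city budget this year?"),
    (date(2021, 5, 11), "city budget town hall tonight"),
    (date(2021, 5, 11), "how will the budget spend the stimulus?"),
    (date(2021, 5, 12), "budget transparency please"),
    (date(2021, 5, 12), "stimulus budget priorities?"),
    (date(2021, 5, 13), "watching the budget hearing"),
]

def mentions_per_day(posts, keyword, start, end):
    """Average daily count of posts containing `keyword` within [start, end]."""
    days = (end - start).days + 1
    hits = sum(1 for d, text in posts if start <= d <= end and keyword in text.lower())
    return hits / days

# Baseline month vs. the days after a funding announcement.
baseline = mentions_per_day(posts, "budget", date(2021, 3, 1), date(2021, 3, 31))
recent = mentions_per_day(posts, "budget", date(2021, 5, 11), date(2021, 5, 13))
spike_ratio = recent / baseline  # a value >= 5 would flag a five-fold jump
```

A production system would use topic models or classifiers rather than keyword matching, but the spike-detection logic is the same: compare current mention frequency to a historical baseline.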

Policy Impacts


About: “Over the past 50 years, researchers have made great strides in analyzing public policy. With better data and improved research methods, we know more than ever about the impacts of government spending.

But despite these advances, it remains surprisingly challenging to answer basic questions about which policies have been most effective.

The difficulty arises because methods for evaluating policy effectiveness are not standardized. This makes it challenging, if not impossible, to compare and contrast across different policies.

Policy Impacts seeks to promote a unified approach to policy evaluation: the Marginal Value of Public Funds (MVPF), a standardized metric of policy effectiveness. We have created the Policy Impacts library, a collaborative effort to track the returns to a wide range of government policies…(More)”.
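The MVPF is conventionally defined as beneficiaries’ willingness to pay for a policy divided by its net cost to the government, where net cost subtracts any revenue the policy later returns (the fiscal externality). A minimal sketch of that arithmetic, with illustrative numbers that are not drawn from the Policy Impacts library:

```python
def mvpf(willingness_to_pay, program_cost, fiscal_externality):
    """Marginal Value of Public Funds: benefit delivered per net dollar
    of government spending.  Net cost = upfront cost minus revenue the
    policy returns to the government (the fiscal externality)."""
    net_cost = program_cost - fiscal_externality
    if net_cost <= 0:
        return float("inf")  # the policy pays for itself
    return willingness_to_pay / net_cost

# Illustrative example: a $1.00 job-training subsidy that beneficiaries
# value at $1.50 and that recoups $0.40 in later tax revenue.
example = mvpf(willingness_to_pay=1.50, program_cost=1.00, fiscal_externality=0.40)
# example == 2.5: each net dollar spent delivers $2.50 of benefit
```

An infinite MVPF marks a policy that pays for itself, which is how the standardized metric lets very different policies be ranked on one scale.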

Closing the Data Gap: How Cities Are Delivering Better Results for Residents


Report by The Monitor Institute by Deloitte: “Better services. Smarter and more efficient use of tax dollars. Greater transparency and civic engagement. These are the results from the data-driven transformation in city halls across the country. The movement that began in just a handful of cities six years ago has now spread far and wide. Hundreds of cities, both large and small and in every region of the country, have embraced a new approach to local governance. Moving beyond old practices based on precedent or instinct, city leaders and staff are instead using data to make more effective operational, programmatic, and policy decisions. And residents are reaping real benefits, from improved services to greater visibility into how their local government works…

  • Performance management: The percentage of cities monitoring and analyzing their progress toward key goals has more than doubled (from 30% to 75%)
  • Public engagement: The percentage of cities engaging with residents on a goal and communicating progress has more than tripled (from 19% to 70%)
  • Releasing data: The percentage of cities with a platform and process to release data to residents has more than tripled (from 18% to 67%)
  • Taking action: The percentage of cities modifying existing programs based on data analytics has more than doubled (from 28% to 61%)

The results: greater transparency around how and why decisions are made, more effective and efficient operations, and improved services. For example, 60% of city officials surveyed in the What Works Cities (WWC) network reported improved emergency response times, and 70% reported that their cities are systematically using data-informed decision-making to respond to the COVID-19 crisis. More than half of survey respondents also reported improving their use of data to make budget decisions, award city contracts and/or shift procurement dollars, and deliver city services more efficiently, effectively, and/or equitably.

This kind of progress builds residents’ trust in government, produces better outcomes, and reflects the broad culture shift underway in city governments across the country — demonstrating that an evidence-informed approach is possible for all U.S. cities. Today, more than 250 municipal governments across the country are changing how they do business and tackling local challenges by putting into place critical data infrastructure and/or improving data skills….(More)”.

The Diffusion of Disruptive Technologies


Paper by Nicholas Bloom, Tarek Alexander Hassan, Aakash Kalyani, Josh Lerner & Ahmed Tahoun: “We identify novel technologies using textual analysis of patents, job postings, and earnings calls. Our approach enables us to identify and document the diffusion of 29 disruptive technologies across firms and labor markets in the U.S. Five stylized facts emerge from our data. First, the locations where technologies are developed that later disrupt businesses are geographically highly concentrated, even more so than overall patenting. Second, as the technologies mature and the number of new jobs related to them grows, they gradually spread across space. While initial hiring is concentrated in high-skilled jobs, over time the mean skill level in new positions associated with the technologies declines, broadening the types of jobs that adopt a given technology. At the same time, the geographic diffusion of low-skilled positions is significantly faster than that of higher-skilled ones, so that the locations where initial discoveries were made retain their lead in high-paying positions for decades. Finally, these technology hubs are more likely to arise in areas with universities and high-skilled labor pools….(More)”
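The paper’s diffusion measure rests on counting documents that mention a technology phrase, split by place and skill level over time. A toy sketch of that bookkeeping, with an invented postings dataset and a plain substring match standing in for the authors’ textual analysis:

```python
from collections import defaultdict

# Hypothetical job postings: (year, metro, skill_level, text).
postings = [
    (2015, "SF", "high", "research scientist, machine learning models"),
    (2015, "SF", "high", "machine learning engineer"),
    (2016, "SF", "low", "data labeling for machine learning"),
    (2016, "Austin", "high", "machine learning platform lead"),
    (2017, "Austin", "low", "machine learning ops technician"),
    (2017, "Omaha", "low", "machine learning support associate"),
]

def diffusion_by_year(postings, phrase):
    """Count postings mentioning `phrase`, split by metro and skill level,
    to trace a technology's spread across places and down the skill ladder."""
    table = defaultdict(lambda: {"metros": set(), "high": 0, "low": 0})
    for year, metro, skill, text in postings:
        if phrase in text.lower():
            row = table[year]
            row["metros"].add(metro)
            row[skill] += 1
    return dict(table)

spread = diffusion_by_year(postings, "machine learning")
# In this toy data the metro set widens over time while low-skill postings
# grow, mirroring the geographic and skill diffusion the paper documents.
```

The real study classifies skill from posted job requirements and identifies technology phrases from patents and earnings calls, but the resulting panel has this same shape: mentions by place, skill, and year.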

Transparency’s AI Problem


Paper by Hannah Bloch-Wehba: “A consensus seems to be emerging that algorithmic governance is too opaque and ought to be made more accountable and transparent. But algorithmic governance underscores the limited capacity of transparency law—the Freedom of Information Act and its state equivalents—to promote accountability. Drawing on the critical literature on “open government,” this Essay shows that algorithmic governance reflects and amplifies systemic weaknesses in the transparency regime, including privatization, secrecy, private sector cooptation, and reactive disclosure. These deficiencies highlight the urgent need to reorient transparency and accountability law toward meaningful public engagement in ongoing oversight. This shift requires rethinking FOIA’s core commitment to public disclosure of agency records, exploring instead alternative ways to empower the public and to shed light on decisionmaking. The Essay argues that new approaches to transparency and accountability for algorithmic governance should be independent of private vendors, and ought to adequately represent the interests of affected individuals and communities. These considerations, of vital importance for the oversight of automated systems, also hold broader lessons for efforts to recraft open government obligations in the public interest….(More)”

Sovereignty and Data Localization


Paper by Emily Wu: “Data localization policies impose obligations on businesses to store and process data locally, rather than in servers located overseas. The adoption of data localization laws has been increasing, driven by the fear that a nation’s sovereignty will be threatened by its inability to exert full control over data stored outside its borders. This is particularly relevant to the US given its dominance in many areas of the digital ecosystem including artificial intelligence and cloud computing.

Unfortunately, data localization policies are causing more harm than good. They are ineffective at improving security, do little to simplify the regulatory landscape, and are causing economic harms to the markets where they are imposed. In order to move away from these policies, the fear of sovereignty dilution must be addressed by alternative means. This will be achieved most effectively by focusing on both technical concerns and value concerns.

To address technical concerns, the US should:

1. Enact a federal national privacy law to reduce the fears that foreign nations have about the power of US tech companies.

2. Mandate privacy and security frameworks by industry to demonstrate the importance that US industry places on privacy and security, recognizing it as fundamental to their business success.

3. Increase investment in cybersecurity to ensure that, in a competitive market, the US has the best offering in both customer experience and security assurance.

4. Expand multilateral agreements under the CLOUD Act to help alleviate the concerns that data stored by US companies will be inaccessible to foreign governments when relevant to a criminal investigation…(More)”

Federal Statistical Needs for a National Advanced Industry and Technology Strategy


Position paper by Robert D. Atkinson: “With the rise of China and other robust economic competitors, the United States needs a more coherent national advanced technology strategy. Effectively crafting and implementing such a strategy requires the right kind of economic data. In part because of years of budget cuts to federal economic data agencies, coupled with a long-standing disregard of the need for sectoral and firm-level economic data to inform an industrial strategy, the federal government is severely lacking in the kinds of data needed.

Notwithstanding the hundreds of millions of dollars spent every year and the thousands of economists working for the federal government, the exact nature of the challenges to U.S. capabilities with regard to the competitiveness of America’s traded sectors is only weakly understood. At least since the Great Depression, the federal government has not felt the need to develop strategic economic intelligence in order to fully understand the competitive position of its traded sectors or to help support overall economic productivity. Rather, most of the focus goes to understanding the ups and downs of the business cycle….

If the U.S. government is going to develop more effective policies to spur competitiveness, growth, and opportunity, it will need to support better data collection. It should be able to understand the U.S. competitive position vis-à-vis other nations on key technologies and industries, as well as key strengths and weaknesses and where specific policies are needed.

Better data can also identify weaknesses in U.S. competitiveness that policy can address. For example, in the 1980s, studies conducted as part of the Census of Manufactures (studies that have long been discontinued) found many smaller firms lagging behind badly in costs and quality for reasons including inefficient work organization and obsolete machinery and equipment. End-product manufacturers bought parts and components from many of these smaller enterprises at prices higher than those paid by foreign-based firms with more efficient suppliers, contributing to the cost and quality disadvantages of U.S.-based manufacturers. Legislators heeded the findings in crafting what is now called the Manufacturing Extension Partnership, a program that, if too small in scale to have a significant impact on U.S. manufacturing overall, continues to provide meaningful assistance to thousands of companies each year.

Moreover, as the federal government institutes more technology and industry policies and programs—as exemplified in the Senate U.S. Innovation and Competition Act—better data will be important to evaluate their effectiveness.

Finally, data are a key 21st century infrastructure. In a decentralized economy, good outcomes are possible only if organizations make good decisions—and that requires data, which, because of its public goods nature, is a quintessential role of government….(More)”.

Why We Should End the Data Economy


Essay by Carissa Véliz: “…The data economy undermines equality and fairness. You and your neighbor are no longer treated as equal citizens. You aren’t given an equal opportunity because you are treated differently on the basis of your data. The ads and content you have access to, the prices you pay for the same services, and even how long you wait when you call customer service depend on your data.

We are much better at collecting personal data than we are at keeping it safe. But personal data is a serious threat, and we shouldn’t be collecting it in the first place if we are incapable of keeping it safe. Using smartphone location data acquired from a data broker, reporters from The New York Times were able to track military officials with security clearances, powerful lawyers and their guests, and even the president of the United States (through the phone of someone believed to be a Secret Service agent).

Our current data economy is based on collecting as much personal data as possible, storing it indefinitely, and selling it to the highest bidder. Having so much sensitive data circulating freely is reckless. By designing our economy around surveillance, we are building a dangerous structure for social control that is at odds with freedom. In the surveillance society we are constructing, there is no such thing as under the radar. It shouldn’t be up to us to constantly opt out of data collection. The default matters, and the default should be no data collection…(More)”.

Virtual Juries


Paper by Valerie P. Hans: “The introduction of virtual or remote jury trials in response to the COVID-19 pandemic constitutes a remarkable natural experiment with one of our nation’s central democratic institutions. Although it is not a tightly controlled experimental study, real world experiences in this natural experiment offer some insights about how key features of trial by jury are affected by a virtual procedure. This article surveys the landscape of virtual jury trials. It examines the issues of jury representativeness, the adequacy of virtual jury selection, the quality of decision making, and the public’s access to jury trial proceedings. Many have expressed concern that the digital divide would negatively affect jury representativeness. Surprisingly, there is some preliminary evidence that suggests that virtual jury selection procedures lead to jury venires that are as diverse, if not more diverse, than pre-pandemic jury venires. Lawyers in a demonstration project reacted favorably to virtual voir dire when it was accompanied by expansive pretrial juror questionnaires and the opportunity to question prospective jurors. A number of courts provided public access by live streaming jury trials. How a virtual jury trial affects jurors’ interpretations of witness testimony, attorney arguments, and jury deliberation remains an open question….(More)”

Is there a role for consent in privacy?


Article by Robert Gellman: “After decades, we still talk about the role of notice and choice in privacy. Yet there seems to be broad recognition that notice and choice do nothing for the privacy of consumers. Some American businesses cling to notice and choice because they hate all the alternatives. Some legislators draft laws with elements of notice and choice, either because it’s easier to draft a law that way, because they don’t know any better, or because they carry water for business.

For present purposes, I will talk about notice and choice generically as consent. Consent is a broader concept than choice, but the difference doesn’t matter for the point I want to make. How you frame consent is complex. There are many alternatives and many approaches. It’s not just a matter of opt-in or opt-out. While I’m discarding issues, I also want to acknowledge and set aside the eight basic Fair Information Practices (FIPs). There is no notice and choice principle in the FIPs, and the FIPs are not specifically important here.

Until recently, my view was that consent in almost any form is pretty much death for consumer privacy. No matter how you structure it, websites and others will find a way to wheedle consent from consumers. Those who want to exploit consumer data will cajole, pressure, threaten, mystify, obscure, entice or otherwise coax consumers to agree.

Suddenly, I’m not as sure of my conclusion about consent. What changed my mind? There is a new data point from Apple’s App Tracking Transparency framework. Apple requires mobile application developers to obtain opt-in consent before serving targeted advertising via Apple’s Identifier for Advertisers. Early reports suggest consumers are saying “NO” in overwhelming numbers — overwhelming as in more than 90%.

It isn’t this strong consumer reaction that makes me think consent might possibly have a place. I want to highlight a different aspect of the Apple framework….(More)”.