The European Union-U.S. Data Privacy Framework


White House Fact Sheet: “Today, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (E.O.) directing the steps that the United States will take to implement the U.S. commitments under the European Union-U.S. Data Privacy Framework (EU-U.S. DPF) announced by President Biden and European Commission President von der Leyen in March of 2022. 

Transatlantic data flows are critical to enabling the $7.1 trillion EU-U.S. economic relationship.  The EU-U.S. DPF will restore an important legal basis for transatlantic data flows by addressing concerns that the Court of Justice of the European Union raised in striking down the prior EU-U.S. Privacy Shield framework as a valid data transfer mechanism under EU law. 

The Executive Order bolsters an already rigorous array of privacy and civil liberties safeguards for U.S. signals intelligence activities. It also creates an independent and binding mechanism enabling individuals in qualifying states and regional economic integration organizations, as designated under the E.O., to seek redress if they believe their personal data was collected through U.S. signals intelligence in a manner that violated applicable U.S. law.

U.S. and EU companies large and small across all sectors of the economy rely upon cross-border data flows to participate in the digital economy and expand economic opportunities. The EU-U.S. DPF represents the culmination of a joint effort by the United States and the European Commission to restore trust and stability to transatlantic data flows and reflects the strength of the enduring EU-U.S. relationship based on our shared values…(More)”.

Call it data liberation day: Patients can now access all their health records digitally  


Article by Casey Ross: “The American Revolution had July 4. The Allies had D-Day. And now U.S. patients, held down for decades by information hoarders, can rally around a new turning point, October 6, 2022 — the day they got their health data back.

Under federal rules taking effect Thursday, health care organizations must give patients unfettered access to their full health records in digital format. No more long delays. No more fax machines. No more exorbitant charges for printed pages.

Just the data, please — now…The new federal rules — passed under the 21st Century Cures Act — are designed to shift the balance of power to ensure that patients can not only get their data, but also choose who else to share it with. It is the jumping-off point for a patient-mediated data economy that lets consumers in health care benefit from the fluidity they’ve had for decades in banking: they can move their information easily and electronically, and link their accounts to new services and software applications.

“To think that we actually have greater transparency about our personal finances than about our own health is quite an indictment,” said Isaac Kohane, a professor of biomedical informatics at Harvard Medical School. “This will go some distance toward reversing that.”

Even with the rules now in place, health data experts said change will not be fast or easy. Providers and other data holders — who have dug in their heels at every step — can still withhold information under certain exceptions. And many questions remain about protocols for sharing digital records, how to verify access rights, and even what it means to give patients all their data. Does that extend to every measurement in the ICU? Every log entry? Every email? And how will it all get standardized?…(More)”
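For illustration, here is a minimal sketch of the kind of app-mediated access the new rules contemplate, assuming a SMART on FHIR-style API (FHIR is the interoperability standard the federal rules build on). The endpoint URL, token, and patient ID are hypothetical placeholders, not any real provider’s system.

```python
# Minimal sketch: fetching a patient's lab results from a hypothetical
# FHIR endpoint, after the patient has authorized access (e.g., via a
# SMART on FHIR consent flow). Endpoint and token are placeholders.
import requests

FHIR_BASE = "https://ehr.example.com/fhir"      # hypothetical provider endpoint
TOKEN = "patient-authorized-oauth-token"        # hypothetical OAuth bearer token

def fetch_lab_observations(patient_id: str) -> list:
    """Return the patient's laboratory Observation resources as dicts."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "category": "laboratory"},
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()                        # FHIR search returns a Bundle
    return [entry["resource"] for entry in bundle.get("entry", [])]
```

Because the rules standardize this interface across certified systems, the same client code could in principle pull records from any compliant provider the patient authorizes.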

Blueprint for an AI Bill of Rights


The White House: “…To advance President Biden’s vision, the White House Office of Science and Technology Policy has identified five principles that should guide the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence. The Blueprint for an AI Bill of Rights is a guide for a society that protects all people from these threats—and uses technologies in ways that reinforce our highest values. Responding to the experiences of the American public, and informed by insights from researchers, technologists, advocates, journalists, and policymakers, this framework is accompanied by From Principles to Practice—a handbook for anyone seeking to incorporate these protections into policy and practice, including detailed steps toward actualizing these principles in the technological design process. These principles help provide guidance whenever automated systems can meaningfully impact the public’s rights, opportunities, or access to critical needs.

  • Safe and Effective Systems
  • Algorithmic Discrimination Protections
  • Data Privacy
  • Notice and Explanation
  • Human Alternatives, Consideration, and Fallback…(More)”.

Big Data and Official Statistics


Paper by Katharine G. Abraham: “The infrastructure and methods for developed countries’ economic statistics, largely established in the mid-20th century, rest almost entirely on survey and administrative data. The increasing difficulty of obtaining survey responses threatens the sustainability of this model. Meanwhile, users of economic data are demanding ever more timely and granular information. “Big data” originally created for other purposes offer the promise of new approaches to the compilation of economic data. Drawing primarily on the U.S. experience, the paper considers the challenges to incorporating big data into the ongoing production of official economic statistics and provides examples of progress towards that goal to date. Beyond their value for the routine production of a standard set of official statistics, new sources of data create opportunities to respond more nimbly to emerging needs for information. The concluding section of the paper argues that national statistical offices should expand their mission to seize these opportunities…(More)”.
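To make the promise concrete, here is a minimal sketch of how one such source (scraped online prices) might feed an elementary price index, using the Jevons (geometric-mean) formula common in elementary aggregation. The input file and its columns are hypothetical.

```python
# Minimal sketch: a Jevons (geometric-mean) elementary price index computed
# from scraped online prices instead of survey responses. The input file
# and its columns (product_id, date, price) are hypothetical.
import numpy as np
import pandas as pd

prices = pd.read_csv("scraped_prices.csv")
prices["month"] = pd.to_datetime(prices["date"]).dt.to_period("M")

# Average each product's observed prices within a month.
monthly = prices.groupby(["product_id", "month"])["price"].mean().unstack("month")

# Geometric mean of month-over-month price relatives across matched
# products; unmatched products (NaNs) are skipped automatically.
log_relatives = np.log(monthly).diff(axis=1).iloc[:, 1:]
jevons = np.exp(log_relatives.mean(axis=0))
print(jevons)   # one index relative per consecutive pair of months
```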

Working with AI: Real Stories of Human-Machine Collaboration


Book by Thomas H. Davenport and Steven M. Miller: “This book breaks through both the hype and the doom-and-gloom surrounding automation and the deployment of artificial intelligence-enabled—“smart”—systems at work. Management and technology experts Thomas Davenport and Steven Miller show that, contrary to widespread predictions, prescriptions, and denunciations, AI is not primarily a job destroyer. Rather, AI changes the way we work—by taking over some tasks but not entire jobs, freeing people to do other, more important and more challenging work. By offering detailed, real-world case studies of AI-augmented jobs in settings that range from finance to the factory floor, Davenport and Miller also show that AI in the workplace is not the stuff of futuristic speculation. It is happening now to many companies and workers. These cases include a digital system for life insurance underwriting that analyzes applications and third-party data in real time, allowing human underwriters to focus on more complex cases; an intelligent telemedicine platform with a chat-based interface; a machine-learning system that identifies impending train maintenance issues by analyzing diesel fuel samples; and Flippy, a robotic assistant for fast food preparation. For each one, Davenport and Miller describe in detail the work context for the system, interviewing job incumbents, managers, and technology vendors. Short “insight” chapters draw out common themes and consider the implications of human collaboration with smart systems…(More)”.

The Data Liberation Project 


About: “The Data Liberation Project is a new initiative I’m launching today to identify, obtain, reformat, clean, document, publish, and disseminate government datasets of public interest. Vast troves of government data are inaccessible to the people and communities who need them most. The Process:

  • Identify: Through its own research, as well as through consultations with journalists, community groups, government-data experts, and others, the Data Liberation Project aims to identify a large number of datasets worth pursuing.
  • Obtain: The Data Liberation Project plans to use a wide range of methods to obtain the datasets, including via Freedom of Information Act requests, intervening in lawsuits, web-scraping, and advanced document parsing. To improve public knowledge about government data systems, the Data Liberation Project also files FOIA requests for essential metadata, such as database schemas, record layouts, data dictionaries, user guides, and glossaries.
  • Reformat: Many datasets are delivered to journalists and the public in difficult-to-use formats. Some may follow arcane conventions or require proprietary software to access, for instance. The Data Liberation Project will convert these datasets into open formats, and restructure them so that they can be more easily examined (see the sketch after this list).
  • Clean: The Data Liberation Project will not alter the raw records it receives. But when the messiness of datasets inhibits their usefulness, the project will create secondary, “clean” versions of datasets that fix these problems.
  • Document: Datasets are meaningless without context, and practically useless without documentation. The Data Liberation Project will gather official documentation for each dataset into a central location. It will also fill observed gaps in the documentation through its own research, interviews, and analysis.
  • Disseminate: The Data Liberation Project will not expect reporters and other members of the public simply to stumble upon these datasets. Instead, it will reach out to the newsrooms and communities that stand to benefit most from the data. The project will host hands-on workshops, webinars, and other events to help others to understand and use the data.”…(More)”
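As a minimal sketch of the “Reformat” and “Clean” steps above, assuming a fixed-width extract plus a record layout obtained via FOIA: the file names, column positions, and field names are all hypothetical.

```python
# Minimal sketch: reformat a hypothetical fixed-width government extract
# into open CSV, then publish a separate "clean" version while leaving
# the raw records untouched. Layout and file names are hypothetical.
import pandas as pd

# Column positions as transcribed from a FOIA'd record-layout document.
layout = {"agency_id": (0, 6), "year": (6, 10), "incident_count": (10, 18)}

# Reformat: parse the proprietary fixed-width export into an open format.
raw = pd.read_fwf("agency_extract.dat",
                  colspecs=list(layout.values()),
                  names=list(layout.keys()), dtype=str)
raw.to_csv("agency_extract_raw.csv", index=False)    # raw records, unaltered

# Clean: a secondary version fixing known messiness, kept separate from raw.
clean = raw.copy()
clean["incident_count"] = pd.to_numeric(clean["incident_count"], errors="coerce")
clean = clean.dropna(subset=["incident_count"])      # drop unparseable rows
clean.to_csv("agency_extract_clean.csv", index=False)
```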

Applications of an Analytic Framework on Using Public Opinion Data for Solving Intelligence Problems


Report by the National Academies of Sciences, Engineering, and Medicine: “Measuring and analyzing public opinion comes with tremendous challenges, as evidenced by recent struggles to predict election outcomes and to anticipate mass mobilizations. The National Academies of Sciences, Engineering, and Medicine publication Measurement and Analysis of Public Opinion: An Analytic Framework presents in-depth information from experts on how to collect and glean insights from public opinion data, particularly in conditions where contextual issues call for applying caveats to those data. The Analytic Framework is designed specifically to help intelligence community analysts apply insights from the social and behavioral sciences on state-of-the-art approaches to analyze public attitudes in non-Western populations. Sponsored by the intelligence community, the National Academies’ Board on Behavioral, Cognitive, and Sensory Sciences hosted a 2-day hybrid workshop on March 8–9, 2022, to present the Analytic Framework and to demonstrate its application across a series of hypothetical scenarios that might arise for an intelligence analyst tasked with summarizing public attitudes to inform a policy decision. Workshop participants explored cutting-edge methods for using large-scale data as well as cultural and ethical considerations for the collection and use of public opinion data. This publication summarizes the presentations and discussions of the workshop…(More)”.

Why Funders Should Go Meta


Paper by Stuart Buck & Anna Harvey: “We don’t mean the former Facebook. Rather, philanthropies should prefer to fund meta-issues—i.e., research and evaluation, along with efforts to improve research quality. In many cases, it would be far more impactful than what they are doing now.

This is true at two levels.

First, suppose you want to support a certain cause–economic development in Africa, or criminal justice reform in the US, etc. You could spend millions or even billions on that cause.

But let’s go meta: a force multiplier would be funding high-quality research on what works on those issues. If you invest significantly in social and behavioral science research, you might find innumerable ways to improve on the existing status quo of donations.

Instead of only helping the existing nonprofits who seek to address economic development or criminal justice reform, you’d be helping to figure out what works and what doesn’t. The result could be a much better set of investments for all donors.

Perhaps some of your initial ideas end up not working, when exhaustively researched. At worst, that’s a temporary embarrassment, but it’s actually all for the better—now you and others know to avoid wasting more money on those ideas. Perhaps some of your favored policies are indeed good ideas (e.g., vaccination), but don’t have anywhere near enough take-up by the affected populations. Social and behavioral science research (as in the Social Science Research Council’s Mercury Project) could help find cost-effective ways to solve that problem…(More)”.

Building the analytic capacity to support critical technology strategy


Paper by Erica R.H. Fuchs: “Existing federal agencies relevant to the science and technology enterprise are appropriately focused on their missions, but the U.S. lacks the intellectual foundations, data infrastructure, and analytics to identify opportunities where the value of investment across missions (e.g., national security, economic prosperity, social well-being) is greater than the sum of its parts.

The U.S. government lacks systematic mechanisms to assess the nation’s strengths, weaknesses, and opportunities in technology and to assess the long chain of suppliers involved in producing products critical to national missions.

Two examples where modern data and analytics—leveraging star interdisciplinary talent from across the nation—and a cross-mission approach could transform outcomes include 1) the difficulties the federal government had in facilitating the production and distribution of personal protective equipment in spring 2020, and 2) the lack of clarity about the causes and solutions to the semiconductor shortage. Going forward, the scale-up of electric vehicles promises similar challenges…

The critical technology analytics (CTA) program would identify 1) how emerging technologies and institutional innovations could potentially transform timely situational awareness of U.S. and global technology capabilities, 2) opportunities for innovation to transform U.S. domestic and international challenges, and 3) win-win opportunities across national missions. The program would be strategic and forward-looking, conducting work on a timeline of months and years rather than days and weeks, and would seek to generalize lessons from individual cases to inform the data and analytics capabilities that the government needs to build to support cross-mission critical technology policy…(More)”.

Lawless Surveillance


Paper by Barry Friedman: “Here in the United States, policing agencies are engaging in mass collection of personal data, building a vast architecture of surveillance. License plate readers collect our location information. Mobile forensics data terminals suck in the contents of cell phones during traffic stops. CCTV maps our movements. Cheap storage means most of this is kept for long periods of time—sometimes into perpetuity. Artificial intelligence makes searching and mining the data a snap. For most of us whose data is collected, stored, and mined, there is no suspicion whatsoever of wrongdoing.

This growing network of surveillance is almost entirely unregulated. It is, in short, lawless. The Fourth Amendment touches almost none of it, either because what is captured occurs in public, and so is supposedly “knowingly exposed,” or because of doctrine that shields information collected from third parties. It is unregulated by statutes because legislative bodies—when they even know about these surveillance systems—see little profit in taking on the police.

In the face of growing concern over such surveillance, this Article argues there is a constitutional solution sitting in plain view. In virtually every other instance in which personal information is collected by the government, courts require that a sound regulatory scheme be in place before information collection occurs. The rulings on the mandatory nature of regulation are remarkably similar, no matter under which clause of the Constitution collection is challenged.

This Article excavates this enormous body of precedent and applies it to the problem of government mass data collection. It argues that before the government can engage in such surveillance, there must be a regulatory scheme in place. And by changing the default rule from allowing police to collect absent legislative prohibition, to banning collection until there is legislative action, legislatures will be compelled to act (or there will be no surveillance). The Article defines what a minimally acceptable regulatory scheme for mass data collection must include, and shows how it can be grounded in the Constitution…(More)”.