Book by Thomas H. Davenport and Steven M. Miller: “This book breaks through both the hype and the doom-and-gloom surrounding automation and the deployment of artificial intelligence-enabled—“smart”—systems at work. Management and technology experts Thomas Davenport and Steven Miller show that, contrary to widespread predictions, prescriptions, and denunciations, AI is not primarily a job destroyer. Rather, AI changes the way we work—by taking over some tasks but not entire jobs, freeing people to do other, more important and more challenging work. By offering detailed, real-world case studies of AI-augmented jobs in settings that range from finance to the factory floor, Davenport and Miller also show that AI in the workplace is not the stuff of futuristic speculation. It is happening now to many companies and workers. These cases include a digital system for life insurance underwriting that analyzes applications and third-party data in real time, allowing human underwriters to focus on more complex cases; an intelligent telemedicine platform with a chat-based interface; a machine-learning system that identifies impending train maintenance issues by analyzing diesel fuel samples; and Flippy, a robotic assistant for fast food preparation. For each one, Davenport and Miller describe in detail the work context for the system, interviewing job incumbents, managers, and technology vendors. Short “insight” chapters draw out common themes and consider the implications of human collaboration with smart systems…(More)”.
The Data Liberation Project
About: “The Data Liberation Project is a new initiative I’m launching today to identify, obtain, reformat, clean, document, publish, and disseminate government datasets of public interest. Vast troves of government data are inaccessible to the people and communities who need them most. The Process:
- Identify: Through its own research, as well as through consultations with journalists, community groups, government-data experts, and others, the Data Liberation Project aims to identify a large number of datasets worth pursuing.
- Obtain: The Data Liberation Project plans to use a wide range of methods to obtain the datasets, including via Freedom of Information Act requests, intervening in lawsuits, web-scraping, and advanced document parsing. To improve public knowledge about government data systems, the Data Liberation Project also files FOIA requests for essential metadata, such as database schemas, record layouts, data dictionaries, user guides, and glossaries.
- Reformat: Many datasets are delivered to journalists and the public in difficult-to-use formats. Some may follow arcane conventions or require proprietary software to access, for instance. The Data Liberation Project will convert these datasets into open formats, and restructure them so that they can be more easily examined.
- Clean: The Data Liberation Project will not alter the raw records it receives. But when the messiness of datasets inhibits their usefulness, the project will create secondary, “clean” versions of datasets that fix these problems.
- Document: Datasets are meaningless without context, and practically useless without documentation. The Data Liberation Project will gather official documentation for each dataset into a central location. It will also fill observed gaps in the documentation through its own research, interviews, and analysis.
- Disseminate: The Data Liberation Project will not expect reporters and other members of the public simply to stumble upon these datasets. Instead, it will reach out to the newsrooms and communities that stand to benefit most from the data. The project will host hands-on workshops, webinars, and other events to help others to understand and use the data.”…(More)”
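The reformat and clean steps above can be made concrete with a minimal sketch. Everything here is hypothetical (the record layout, column names, and field widths are invented for illustration, not taken from any actual Data Liberation Project release): it converts a fixed-width government extract of the kind a FOIA'd "record layout" document might describe into an open CSV, adding a derived column without altering the raw values.

```python
import csv
import io

# Hypothetical fixed-width layout: name (cols 1-20), state (21-22),
# amount in cents (23-32). Real layouts would come from the agency's
# record-layout documentation obtained via FOIA.
LAYOUT = [("name", 0, 20), ("state", 20, 22), ("amount_cents", 22, 32)]

RAW = (
    "ACME SUPPLY CO      TX0000012500\n"
    "BETA SERVICES LLC   CA0000003075\n"
)

def reformat(raw_text):
    """Parse fixed-width records into dicts (the 'reformat' step)."""
    rows = []
    for line in raw_text.splitlines():
        row = {field: line[start:end].strip() for field, start, end in LAYOUT}
        # 'Clean' step: add a secondary, analysis-friendly column
        # while leaving the raw amount_cents value untouched.
        row["amount_usd"] = int(row["amount_cents"]) / 100
        rows.append(row)
    return rows

def to_csv(rows):
    """Write the cleaned rows out as CSV, an open format."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_csv(reformat(RAW)))
```

The key design point mirrors the project's stated policy: the raw fields are carried through unmodified, and cleaning only ever adds secondary columns or tables alongside them.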
Applications of an Analytic Framework on Using Public Opinion Data for Solving Intelligence Problems
Report by the National Academies of Sciences, Engineering, and Medicine: “Measuring and analyzing public opinion comes with tremendous challenges, as evidenced by recent struggles to predict election outcomes and to anticipate mass mobilizations. The National Academies of Sciences, Engineering, and Medicine publication Measurement and Analysis of Public Opinion: An Analytic Framework presents in-depth information from experts on how to collect and glean insights from public opinion data, particularly in conditions where contextual issues call for applying caveats to those data. The Analytic Framework is designed specifically to help intelligence community analysts apply insights from the social and behavioral sciences on state-of-the-art approaches to analyze public attitudes in non-Western populations. Sponsored by the intelligence community, the National Academies’ Board on Behavioral, Cognitive, and Sensory Sciences hosted a 2-day hybrid workshop on March 8–9, 2022, to present the Analytic Framework and to demonstrate its application across a series of hypothetical scenarios that might arise for an intelligence analyst tasked with summarizing public attitudes to inform a policy decision. Workshop participants explored cutting-edge methods for using large-scale data as well as cultural and ethical considerations for the collection and use of public opinion data. This publication summarizes the presentations and discussions of the workshop…(More)”.
Why Funders Should Go Meta
Paper by Stuart Buck & Anna Harvey: “We don’t mean the former Facebook. Rather, philanthropies should prefer to fund meta-issues—i.e., research and evaluation, along with efforts to improve research quality. In many cases, it would be far more impactful than what they are doing now.
This is true at two levels.
First, suppose you want to support a certain cause: economic development in Africa, criminal justice reform in the US, and so on. You could spend millions or even billions on that cause.
But let’s go meta: a force multiplier would be funding high-quality research on what works on those issues. If you invest significantly in social and behavioral science research, you might find innumerable ways to improve on the status quo of donations.
Instead of only helping the existing nonprofits who seek to address economic development or criminal justice reform, you’d be helping to figure out what works and what doesn’t. The result could be a much better set of investments for all donors.
Perhaps some of your initial ideas end up not working, when exhaustively researched. At worst, that’s a temporary embarrassment, but it’s actually all for the better—now you and others know to avoid wasting more money on those ideas. Perhaps some of your favored policies are indeed good ideas (e.g., vaccination), but don’t have anywhere near enough take-up by the affected populations. Social and behavioral science research (as in the Social Science Research Council’s Mercury Project) could help find cost-effective ways to solve that problem…(More)”.
Building the analytic capacity to support critical technology strategy
Paper by Erica R.H. Fuchs: “Existing federal agencies relevant to the science and technology enterprise are appropriately focused on their missions, but the U.S. lacks the intellectual foundations, data infrastructure, and analytics to identify opportunities where the value of investment across missions (e.g., national security, economic prosperity, social well-being) is greater than the sum of its parts.
The U.S. government lacks systematic mechanisms to assess the nation’s strengths, weaknesses, and opportunities in technology and to assess the long chain of suppliers involved in producing products critical to national missions.
Two examples where modern data and analytics—leveraging star interdisciplinary talent from across the nation—and a cross-mission approach could transform outcomes include 1) the difficulties the federal government had in facilitating the production and distribution of personal protective equipment in spring 2020, and 2) the lack of clarity about the causes and solutions to the semiconductor shortage. Going forward, the scale-up of electric vehicles promises similar challenges…
The critical technology analytics (CTA) would identify 1) how emerging technologies and institutional innovations could potentially transform timely situational awareness of U.S. and global technology capabilities, 2) opportunities for innovation to transform U.S. domestic and international challenges, and 3) win-win opportunities across national missions. The program would be strategic and forward-looking, conducting work on a timeline of months and years rather than days and weeks, and would seek to generalize lessons from individual cases to inform the data and analytics capabilities that the government needs to build to support cross-mission critical technology policy…(More)”.
Lawless Surveillance
Paper by Barry Friedman: “Here in the United States, policing agencies are engaging in mass collection of personal data, building a vast architecture of surveillance. License plate readers collect our location information. Mobile forensics data terminals suck in the contents of cell phones during traffic stops. CCTV maps our movements. Cheap storage means most of this is kept for long periods of time—sometimes into perpetuity. Artificial intelligence makes searching and mining the data a snap. For most of us whose data is collected, stored, and mined, there is no suspicion whatsoever of wrongdoing.
This growing network of surveillance is almost entirely unregulated. It is, in short, lawless. The Fourth Amendment touches almost none of it, either because what is captured occurs in public, and so is supposedly “knowingly exposed,” or because of doctrine that shields information collected from third parties. It is unregulated by statutes because legislative bodies—when they even know about these surveillance systems—see little profit in taking on the police.
In the face of growing concern over such surveillance, this Article argues there is a constitutional solution sitting in plain view. In virtually every other instance in which personal information is collected by the government, courts require that a sound regulatory scheme be in place before information collection occurs. The rulings on the mandatory nature of regulation are remarkably similar, no matter under which clause of the Constitution collection is challenged.
This Article excavates this enormous body of precedent and applies it to the problem of government mass data collection. It argues that before the government can engage in such surveillance, there must be a regulatory scheme in place. And by changing the default rule from allowing police to collect absent legislative prohibition, to banning collection until there is legislative action, legislatures will be compelled to act (or there will be no surveillance). The Article defines what a minimally-acceptable regulatory scheme for mass data collection must include, and shows how it can be grounded in the Constitution…(More)”.
Citizens can effectively monitor the integrity of their elections: Evidence from Colombia
Paper by Natalia Garbiras-Díaz and Mateo Montenegro: “ICT-enabled monitoring tools effectively encourage citizens to oversee their elections and reduce fraud.
Despite many efforts by governments and international organizations to guarantee free and fair elections, in many democracies, electoral integrity continues to be threatened. Irregularities including fraud, vote buying or voter intimidation reduce political accountability, which can distort the allocation of public goods and services (Hicken 2011, Khemani 2015).
But why is it so hard to prevent and curb electoral irregularities? While traditional strategies such as the deployment of electoral observers and auditors have proven effective (Hyde 2010, Enikolopov et al. 2013, Leefers and Vicente 2019), these are difficult to scale up and involve large investments in the training, security and transportation of personnel to remote and developing areas.
In Garbiras-Díaz and Montenegro (2022), we designed and implemented a large-scale field experiment during the election period in Colombia to study an innovative and light-touch strategy that circumvents many of these costs. We examine whether citizens can effectively oversee elections through online platforms, and demonstrate that delegating monitoring to citizens can provide a cost-effective alternative to more traditional strategies. Moreover, with growing access to the internet in developing countries reducing the barriers to online monitoring, this strategy is scalable and can be particularly impactful. Our results show how citizens can be encouraged to monitor elections, and, more importantly, illustrate how this form of monitoring can prevent politicians from using electoral irregularities to undermine the integrity of elections…(More)”.
All Democracy Is Global
Article by Larry Diamond: “The world is mired in a deep, diffuse, and protracted democratic recession. According to Freedom House, 2021 was the 16th consecutive year in which more countries declined in freedom than gained. Tunisia, the sole democracy to emerge from the Arab Spring protests that began in 2010, is morphing into a dictatorship. In countries as diverse as Bangladesh, Hungary, and Turkey, elections have long ceased to be democratic. Autocrats in Algeria, Belarus, Ethiopia, Sudan, Turkey, and Zimbabwe have clung to power despite mounting public demands for democratization. In Africa, seven democracies have slid back into autocracy since 2015, including Benin and Burkina Faso.
Democracy is looking shaky even in countries that hold free and fair elections. In emerging-market behemoths such as Brazil, India, and Mexico, democratic institutions and norms are under attack. Brazilian President Jair Bolsonaro has made threats of an autogolpe (self-coup) and a possible return to military rule if he does not win reelection in October. Indian Prime Minister Narendra Modi has steadily chipped away at press freedoms, minority rights, judicial independence, the integrity of the civil service, and the autonomy of civil society. Mexican President Andrés Manuel López Obrador has attempted to silence critics and remove democratic checks and balances.
Democratic prospects have risen and fallen in decades past, but they now confront a formidable new problem: democracy is at risk in the very country that has traditionally been its most ardent champion. Over the past dozen years, the United States has experienced one of the biggest declines in political rights and civil liberties of any country measured by the Freedom House annual survey. The Economist now ranks the United States as a “flawed democracy” behind Spain, Costa Rica, and Chile. U.S. President Donald Trump deserves much of the blame: he abused presidential power on a scale unprecedented in U.S. history and, after being voted out of office, propagated the “Big Lie” of election fraud and incited the violent rioters who stormed the U.S. Capitol on January 6, 2021. But American democracy was in peril before Trump assumed office, with rising polarization exposing acute flaws in American democratic institutions. The Electoral College, the representational structure of the Senate, the Senate filibuster, the brazen gerrymandering of House districts, and lifetime appointments to the Supreme Court have all made it possible for a political minority to exert prolonged outsize influence.
Can a country in the throes of its own democratic decay do anything to arrest the broader global decline? For many, the answer is no…(More)”.
Social capital: measurement and associations with economic mobility
Paper by Raj Chetty et al: “Social capital—the strength of an individual’s social network and community—has been identified as a potential determinant of outcomes ranging from education to health. However, efforts to understand what types of social capital matter for these outcomes have been hindered by a lack of social network data. Here, in the first of a pair of papers, we use data on 21 billion friendships from Facebook to study social capital. We measure and analyse three types of social capital by ZIP (postal) code in the United States: (1) connectedness between different types of people, such as those with low versus high socioeconomic status (SES); (2) social cohesion, such as the extent of cliques in friendship networks; and (3) civic engagement, such as rates of volunteering. These measures vary substantially across areas, but are not highly correlated with each other. We demonstrate the importance of distinguishing these forms of social capital by analysing their associations with economic mobility across areas. The share of high-SES friends among individuals with low SES—which we term economic connectedness—is among the strongest predictors of upward income mobility identified to date. Other social capital measures are not strongly associated with economic mobility. If children with low-SES parents were to grow up in counties with economic connectedness comparable to that of the average child with high-SES parents, their incomes in adulthood would increase by 20% on average. Differences in economic connectedness can explain well-known relationships between upward income mobility and racial segregation, poverty rates, and inequality. To support further research and policy interventions, we publicly release privacy-protected statistics on social capital by ZIP code at https://www.socialcapital.org…(More)”.
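The economic-connectedness measure described in the abstract has a simple operational form: for each area, average the share of high-SES friends over its low-SES residents. A toy sketch of that computation (all names, ZIP codes, SES labels, and friendships below are invented for illustration and have nothing to do with the paper's Facebook data):

```python
from collections import defaultdict

# Invented toy data: SES class, ZIP code, and friendship lists
# for five hypothetical people.
ses = {"a": "low", "b": "low", "c": "high", "d": "high", "e": "low"}
zip_code = {p: "10001" for p in ses}
friends = {
    "a": ["c", "d"],  # 2 of 2 friends are high-SES
    "b": ["c", "e"],  # 1 of 2 friends is high-SES
    "e": ["b"],       # 0 of 1 friends are high-SES
}

def economic_connectedness(friends, ses, zip_code):
    """Average share of high-SES friends among low-SES residents, per ZIP."""
    shares = defaultdict(list)
    for person, fs in friends.items():
        if ses[person] == "low" and fs:
            high_share = sum(ses[f] == "high" for f in fs) / len(fs)
            shares[zip_code[person]].append(high_share)
    return {z: sum(v) / len(v) for z, v in shares.items()}

print(economic_connectedness(friends, ses, zip_code))  # {'10001': 0.5}
```

This is only the headline statistic; the paper's actual construction involves additional steps (SES estimation, privacy protection) not modeled here.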
Learning to Share: Lessons on Data-Sharing from Beyond Social Media
Paper by CDT: “What role has social media played in society? Did it influence the rise of Trumpism in the U.S. and the passage of Brexit in the UK? What about the way authoritarians exercise power in India or China? Has social media undermined teenage mental health? What about its role in building social and community capital, promoting economic development, and so on?
To answer these and other important policy-related questions, researchers such as academics, journalists, and others need access to data from social media companies. However, this data is generally not available to researchers outside of social media companies and, where it is available, it is often insufficient, meaning that we are left with incomplete answers.
Governments on both sides of the Atlantic have passed or proposed legislation to address the problem by requiring social media companies to provide certain data to vetted researchers (Vogus, 2022a). Researchers themselves have thought a lot about the problem, including the specific types of data that can further public interest research, how researchers should be vetted, and the mechanisms companies can use to provide data (Vogus, 2022b).
For their part, social media companies have sanctioned some methods to share data with certain types of researchers through APIs (e.g., for researchers with university affiliations) and with certain limitations (such as limits on how much and what types of data are available). In general, these efforts have been insufficient. In part, this is due to legitimate concerns such as the need to protect user privacy or to avoid revealing company trade secrets. But, in some cases, the lack of sharing is due to other factors, such as a lack of resources or knowledge about how to share data effectively, or resistance to independent scrutiny.
The problem is complex but not intractable. In this report, we look to other industries where companies share data with researchers through different mechanisms while also addressing concerns around privacy. In doing so, our analysis contributes to current public and corporate discussions about how to safely and effectively share social media data with researchers. We review experiences based on the governance of clinical trials, electricity smart meters, and environmental impact data…(More)”