Data Analysis for Social Science: A Friendly and Practical Introduction


Book by Elena Llaudet and Kosuke Imai: “…provides a friendly introduction to the statistical concepts and programming skills needed to conduct and evaluate social scientific studies. Using plain language and assuming no prior knowledge of statistics and coding, the book provides a step-by-step guide to analyzing real-world data with the statistical program R for the purpose of answering a wide range of substantive social science questions. It teaches not only how to perform the analyses but also how to interpret results and identify strengths and limitations. This one-of-a-kind textbook includes supplemental materials to accommodate students with minimal knowledge of math and clearly identifies sections with more advanced material so that readers can skip them if they so choose.

  • Analyzes real-world data using the powerful, open-source statistical program R, which is free for everyone to use
  • Teaches how to measure, predict, and explain quantities of interest based on data
  • Shows how to infer population characteristics using survey research, predict outcomes using linear models, and estimate causal effects with and without randomized experiments
  • Assumes no prior knowledge of statistics or coding
  • Specifically designed to accommodate students with a variety of math backgrounds
  • Provides cheatsheets of statistical concepts and R code
  • Supporting materials available online, including real-world datasets and the code to analyze them, plus—for instructor use—sample syllabi, sample lecture slides, additional datasets, and additional exercises with solutions…(More)”.

Universal Access and Its Asymmetries


Book by Harmeet Sawhney and Hamid R. Ekbia: “Universal access—the idea that certain technologies and services should be extended to all regardless of geography or ability to pay—evokes ideals of democracy and equality that must be reconciled with the realities on the ground. The COVID-19 pandemic raised awareness of the need for access to high-speed internet service in the United States, but this is just the latest in a long history of debates about what should be made available and to whom. Rural mail delivery, electrification, telephone service, public schooling, and library access each raised the same questions as today’s debates about health care and broadband. What types of services should be universally available? Who benefits from extending these services? And who bears the cost?

Stepping beyond humanitarian arguments to conduct a clear-eyed, diagnostic analysis, this book offers some surprising conclusions. While the conventional approach to universal access looks primarily at the costs to the system and the benefits to individuals, Harmeet Sawhney and Hamid Ekbia provide a holistic perspective that also accounts for costs to individuals and benefits for systems. With a comparative approach across multiple cases, Universal Access and Its Asymmetries is an essential exploration of the history, costs, and benefits of providing universal access to technologies and services. With a fresh perspective, it overturns common assumptions and offers a foundation for making decisions about how to extend service—and how to pay for it…(More)”.

The Power of Partnership in Open Government


Book by Suzanne J. Piotrowski, Daniel Berliner and Alex Ingrams: “At the 2011 meeting of the UN General Assembly, the governments of eight nations—Brazil, Indonesia, Mexico, Norway, the Philippines, South Africa, the United Kingdom, and the United States—launched the Open Government Partnership, a multilateral initiative aimed at promoting transparency, empowering citizens, fighting corruption, and harnessing new technologies to strengthen governance. At the time, many were concerned that the Open Government Partnership would end up toothless, offering only lip service to vague ideals and misguided cyber-optimism. The Power of Partnership in Open Government offers a close look, and a surprising affirmation, of the Open Government Partnership as an example of a successful transnational multistakeholder initiative that has indeed impacted policy and helped to produce progressive reform.

By 2019 the Open Government Partnership had grown to 78 member countries and 20 subnational governments. Through a variety of methods—document analysis, interviews, process tracing, and quantitative analysis of secondary data—Suzanne J. Piotrowski, Daniel Berliner, and Alex Ingrams chart the Open Government Partnership’s effectiveness and evaluate what this reveals about the potential of international reform initiatives in general. Their work calls upon scholars and policymakers to reconsider the role of international institutions and, in doing so, to differentiate between direct and indirect pathways to transnational impact on domestic policy. The more nuanced and complex processes of the indirect pathway, they suggest, have considerable but often overlooked potential to shape policy norms and models, alter resources and opportunities, and forge new linkages and coalitions—in short, to drive the substantial changes that inspire initiatives like the Open Government Partnership…(More)”.

The Dangers of Systems Illiteracy


Review by Carol Dumaine: “In 1918, as the Great War was coming to an end after four bloody years of brutal conflict, an influenza pandemic began to ravage societies around the globe. Evidence indicates that US president Woodrow Wilson was stricken with the flu in the spring of 1919, while in Paris negotiating the terms of the peace agreement. 

Wilson, who had been intransigent in insisting on just peace terms for the defeated nations (what he called “peace without victory”), underwent a profound change of mental state that his personal physician and closest advisors attributed to his illness. While sick, Wilson suddenly agreed to all the terms he had previously adamantly rejected and approved a treaty that made onerous demands of Germany. 

Wilson’s reversal left Germans embittered and his own advisors disillusioned. Historian John M. Barry, who recounts this episode in his book about the 1918 pandemic, The Great Influenza, observes that most historians agree “that the harshness toward Germany of the Paris peace treaty helped create the economic hardship, nationalistic reaction, and political chaos that fostered the rise of Hitler.” 

This anecdote is a vivid illustration of how a public health disaster can intersect with world affairs, potentially sowing the seeds for a future of war. Converging crises can leave societies with too little time to regroup, breaking down resilience and capacities for governance. Barry concludes from his research into the 1918 pandemic that to forestall this loss of authority—and perhaps to avoid future, unforeseen repercussions—government leaders should share the unvarnished facts and evolving knowledge of a situation. 

Society is ultimately based on trust; during the flu pandemic, “as trust broke down, people became alienated not only from those in authority, but from each other.” Barry continues, “Those in authority must retain the public’s trust. The way to do that is to distort nothing, to put the best face on nothing, to try to manipulate no one.”

Charles Weiss makes a similar argument in his new book, The Survival Nexus: Science, Technology, and World Affairs. Weiss contends that the preventable human and economic losses of the COVID-19 pandemic were the result of politicians avoiding harsh truths: “Political leaders suppressed evidence of virus spread, downplayed the importance of the epidemic and the need to observe measures to protect the health of the population, ignored the opinions of local experts, and publicized bogus ‘cures’—all to avoid economic damage and public panic, but equally importantly to consolidate political power and to show themselves as strong leaders who were firmly in control.” …(More)”.

Language and the Rise of the Algorithm


Book by Jeffrey M. Binder: “Bringing together the histories of mathematics, computer science, and linguistic thought, Language and the Rise of the Algorithm reveals how recent developments in artificial intelligence are reopening an issue that troubled mathematicians well before the computer age: How do you draw the line between computational rules and the complexities of making systems comprehensible to people? By attending to this question, we come to see that the modern idea of the algorithm is implicated in a long history of attempts to maintain a disciplinary boundary separating technical knowledge from the languages people speak day to day.
 
Here Jeffrey M. Binder offers a compelling tour of four visions of universal computation that addressed this issue in very different ways: G. W. Leibniz’s calculus ratiocinator; a universal algebra scheme Nicolas de Condorcet designed during the French Revolution; George Boole’s nineteenth-century logic system; and the early programming language ALGOL, short for algorithmic language. These episodes show that symbolic computation has repeatedly become entangled in debates about the nature of communication. Machine learning, in its increasing dependence on words, erodes the line between technical and everyday language, revealing the urgent stakes underlying this boundary.
 
The idea of the algorithm is a levee holding back the social complexity of language, and it is about to break. This book is about the flood that inspired its construction…(More)”.

Research Methods in Deliberative Democracy


Book edited by Selen A. Ercan et al: “… brings together a wide range of methods used in the study of deliberative democracy. It offers thirty-one different methods that scholars use for theorizing, measuring, exploring, or applying deliberative democracy. Each chapter presents one method by explaining its utility in deliberative democracy research and providing guidance on its application by drawing on examples from previous studies. The book aims to inspire scholars to undertake methodologically robust, intellectually creative, and politically relevant research. It fills a significant gap in a rapidly growing field of research by assembling diverse methods and thereby expanding the range of methodological choices available to students, scholars, and practitioners of deliberative democracy…(More)”.

Beyond Measure: The Hidden History of Measurement from Cubits to Quantum Constants


Book by James Vincent: “From the cubit to the kilogram, the humble inch to the speed of light, measurement is a powerful tool that humans invented to make sense of the world. In this revelatory work of science and social history, James Vincent dives into its hidden world, taking readers from ancient Egypt, where measuring the annual depth of the Nile was an essential task, to the intellectual origins of the metric system in the French Revolution, and from the surprisingly animated rivalry between metric and imperial, to our current age of the “quantified self.” At every turn, Vincent is keenly attuned to the political consequences of measurement, exploring how it has also been used as a tool for oppression and control.

Beyond Measure reveals how measurement is not only deeply entwined with our experience of the world, but also how its history encompasses and shapes the human quest for knowledge…(More)”.

Science in Negotiation


Book by Jessica Espey on “The Role of Scientific Evidence in Shaping the United Nations Sustainable Development Goals, 2012-2015”: “This book explores the role of scientific evidence within United Nations (UN) deliberation by examining the negotiation of the Sustainable Development Goals (SDGs), endorsed by Member States in 2015. Using the SDGs as a case study, this book addresses a key gap in our understanding of the role of evidence in contemporary international policy-making. It is structured around three overarching questions: (1) how does scientific evidence influence multilateral policy development within the UN General Assembly; (2) how did evidence shape the goals and targets that constitute the SDGs; and (3) how did institutional arrangements and non-state actor engagements mediate the evidence-to-policy process in the development of the SDGs? The ultimate intention is to tease out lessons on global policy-making and to understand the influence of different evidence inputs and institutional factors in shaping outcomes.

To understand the value afforded to scientific evidence within multilateral deliberation, a conceptual framework is provided drawing upon literature from policy studies and political science, including recent theories of evidence-informed policy-making and new institutionalism. It posits that the success or failure of evidence informing global political processes rests upon the representation and access of scientific stakeholders, levels of community organisation, the framing and presentation of evidence, and time, including the duration over which evidence and key conceptual ideas are presented. Cutting across the discussion is the fundamental question of whose evidence counts and how expertise is defined. The framework is tested with specific reference to three themes that were prominent during the SDG negotiation process: public health (articulated in SDG 3), urban sustainability (articulated in SDG 11), and data and information systems (a cross-cutting theme of the dialogue). Within each, scientific communities had specific demands, and through an exploration of key literature, including evidence inputs and UN documentation, as well as through key informant interviews, the translation of these scientific ideas into policy priorities is uncovered…(More)”.

Digital Oil


Book by Eric Monteiro: “Digitalization sits at the forefront of public and academic conversation today, calling into question how we work and how we know. In Digital Oil, Eric Monteiro uses the Norwegian offshore oil and gas industry as a lens to investigate the effects of digitalization on embodied labor and, in doing so, shows how our use of new digital technology transforms work and knowing.

For years, roughnecks have performed the dangerous and unwieldy work of extracting the oil that lies three miles below the seabed along the Norwegian continental shelf. Today, the Norwegian oil industry is largely digital, operated by sensors and driven by data. Digital representations of physical processes inform work practices and decision-making with remotely operated, unmanned deep-sea facilities. Drawing on two decades of in-depth interviews, observations, news clips, and studies of this industry, Monteiro dismantles the divide between the virtual and the physical in Digital Oil.

What is gained or lost when objects and processes become algorithmic phenomena, with the digital inferred from the physical? How can data-driven work practices and operational decision-making approximate qualitative interpretation, professional judgement, and evaluation? How are emergent digital platforms and infrastructures, as machineries of knowing, enabling digitalization? In answering these questions, Monteiro offers a novel analysis of digitalization as an effort to press the limits of quantification of the qualitative…(More)”.