Book edited by Fred S. Roberts and Igor A. Sheremet: “The growth of a global digital economy has enabled rapid communication, instantaneous movement of funds, and availability of vast amounts of information. With this come challenges such as the vulnerability of digitalized sociotechnological systems (STSs) to destructive events (earthquakes, disease events, terrorist attacks). Similar issues arise for disruptions to complex linked natural and social systems (from changing climates, evolving urban environments, etc.). This book explores new approaches to the resilience of sociotechnological and natural-social systems in a digital world of big data, extraordinary computing capacity, and rapidly developing methods of Artificial Intelligence….
The worldwide COVID-19 pandemic illustrates the vulnerability of our healthcare systems, supply chains, and social infrastructure, and challenges our notions of what makes a system resilient. We have found that reliance on AI tools can lead to problems when unexpected events occur. On the other hand, the vast amounts of data available from sensors, satellite images, social media, etc. can also be used to make modern systems more resilient.
Papers in the book explore disruptions of complex networks and algorithms that minimize departure from a previous state after a disruption; introduce a multigrammatical framework for the technological and resource bases of today’s large-scale industrial systems and the transformations resulting from disruptive events; and explain how robotics can enhance pre-emptive measures or post-disaster responses to increase resiliency. Other papers explore current directions in data processing and handling and principles of FAIRness in data, as well as how the availability of large amounts of data can aid in the development of resilient STSs and the challenges to overcome in doing so. The book also addresses interactions between humans and built environments, focusing on how AI can inform today’s smart and connected buildings and make them resilient, and how AI tools can increase resilience to misinformation and its dissemination….(More)”.
Book edited by Ann Blair, Paul Duguid, Anja-Silvia Goeing, and Anthony Grafton: “Thanks to modern technological advances, we now enjoy seemingly unlimited access to information. Yet how did information become so central to our everyday lives, and how did its processing and storage make our data-driven era possible? This volume is the first to consider these questions in comprehensive detail, tracing the global emergence of information practices, technologies, and more, from the premodern era to the present. With entries spanning archivists to algorithms and scribes to surveillance, this is the ultimate reference on how information has shaped and been shaped by societies.
Written by an international team of experts, the book’s inspired and original long- and short-form contributions reconstruct the rise of human approaches to creating, managing, and sharing facts and knowledge. Thirteen full-length chapters discuss the role of information in pivotal epochs and regions, with chief emphasis on Europe and North America, but also substantive treatment of other parts of the world as well as current global interconnections. More than 100 alphabetical entries follow, focusing on specific tools, methods, and concepts—from ancient coins to the office memo, and censorship to plagiarism. The result is a wide-ranging, deeply immersive collection that will appeal to anyone drawn to the story behind our modern mania for an informed existence….(More)”.
Book by Audrey Kurth Cronin on “How Open Technological Innovation is Arming Tomorrow’s Terrorists…Never have so many possessed the means to be so lethal. The diffusion of modern technology (robotics, cyber weapons, 3-D printing, autonomous systems, and artificial intelligence) to ordinary people has given them access to weapons of mass violence previously monopolized by the state. In recent years, states have attempted to stem the flow of such weapons to individuals and non-state groups, but their efforts are failing.
As Audrey Kurth Cronin explains in Power to the People, what we are seeing now is an exacerbation of an age-old trend. Over the centuries, the most surprising developments in warfare have occurred because of advances in technologies combined with changes in who can use them. Indeed, accessible innovations in destructive force have long driven new patterns of political violence. When Nobel invented dynamite and Kalashnikov designed the AK-47, each inadvertently spurred terrorist and insurgent movements that killed millions and upended the international system.
That history illuminates our own situation, in which emerging technologies are altering society and redistributing power. The twenty-first century “sharing economy” has already disrupted every institution, including the armed forces. New “open” technologies are transforming access to the means of violence. Just as importantly, higher-order functions that previously had been exclusively under state military control – mass mobilization, force projection, and systems integration – are being harnessed by non-state actors. Cronin closes by focusing on how to respond so that we both preserve the benefits of emerging technologies and reduce the risks. Power, in the form of lethal technology, is flowing to the people, but the same technologies that empower can imperil global security – unless we act strategically….(More)”.
Book edited by Dilip Soman and Catherine Yeung: “…This edited volume represents the first output from this international partnership. The book is designed to reflect our conceptual thinking, outline some early results from the partnership and an agenda for research and practice, and provide roadmaps to help both practitioners and academics converge in the common quest of developing behaviorally informed organizations. The book is divided into four parts.
In Part 1, “The Behaviorally Informed Organization,” four chapters lay out an agenda for what such an organization should be and could be. In chapter 1, Soman talks about the science of using behavioral science by developing a brief history of the field of behavioral science, outlining organizational realities, and generating a research agenda to help develop BIOrgs. In chapter 2, Feng and colleagues further develop an understanding of organizational realities and outline what resources and capabilities organizations need to develop in order to be truly behaviorally informed. In particular, they develop the notion of the cost of experimentation and make the point that driving down the cost of experimentation is key in developing behaviorally informed organizations. In chapter 3, Vinski asks and answers the question, “Why should organizations even want to be behaviorally informed?”; and in chapter 4, O’Malley and Peters add to that question by further addressing why organizations might actively resist the need to be behaviorally informed.
Organizational settings provide existing tools but also additional complexities, and in Part 2, “Overarching Insights and Tools,” four chapters address some of these organizational realities. Chapter 5 talks about “sludge” – small aspects of an organizationally created context that create friction for end-users. If sludge is not cleared, the effectiveness of behavioral interventions will be constrained, and hence this chapter makes a case for identifying and eliminating sludge. In chapter 6, Duncan and colleagues provide a guide to writing guidelines, an important tool for most policymakers and businesses as they attempt to provide helpful information to their citizens and customers. Given that organizations have multiple interactions for multiple products and services with their end-users, a binary classification into econs and humans is not feasible or helpful. Therefore, in chapter 7, Ireland talks about the boundedly rational complex consumer continuum, a nuanced framework for segmenting recipients of behavioral interventions. Given that end-users are inundated with information and other stimuli from organizations, it is unclear that they will attend to it all. In chapter 8, Hilchey and Taylor write about the psychology of attention and its implications for helping end-users make better decisions….(More)”.
Book edited by Arul Chib, Caitlin M. Bentley, and Matthew L. Smith: “Over the last ten years, “open” innovations—the sharing of information and communications resources without access restrictions or cost—have emerged within international development. But do these innovations empower poor and marginalized populations? This book examines whether, for whom, and under what circumstances the free, networked, public sharing of information and communication resources contributes (or not) toward a process of positive social transformation. The contributors offer cross-cutting theoretical frameworks and empirical analyses that cover a broad range of applications, emphasizing the underlying aspects of open innovations that are shared across contexts and domains.
The book first outlines theoretical frameworks that span knowledge stewardship, trust, situated learning, identity, participation, and power decentralization. It then investigates these frameworks across a range of institutional and country contexts, considering each in terms of the key emancipatory principles and structural impediments it seeks to address. Taken together, the chapters offer an empirically tested theoretical direction for the field….(More)”.
Book by Kate Crawford: “What happens when artificial intelligence saturates political life and depletes the planet? How is AI shaping our understanding of ourselves and our societies? In this book Kate Crawford reveals how AI, as a planetary network of computation and extraction, is fueling a shift toward undemocratic governance and increased inequality. Drawing on more than a decade of research, she shows how AI is a technology of extraction: from the energy and minerals needed to build and sustain its infrastructure, to the exploited workers behind “automated” services, to the data AI collects from us.
Rather than taking a narrow focus on code and algorithms, Crawford offers us a political and a material perspective on what it takes to make artificial intelligence and where it goes wrong. While technical systems present a veneer of objectivity, they are always systems of power. This is an urgent account of what is at stake as technology companies use artificial intelligence to reshape the world…(More)”.
Book by Aileen Nielsen: “Fairness is becoming a paramount consideration for data scientists. Mounting evidence indicates that the widespread deployment of machine learning and AI in business and government is reproducing the same biases we’re trying to fight in the real world. But what does fairness mean when it comes to code? This practical book covers basic concerns related to data security and privacy to help data and AI professionals use code that’s fair and free of bias.
Many realistic best practices are emerging at all steps along the data pipeline today, from data selection and preprocessing to closed model audits. Author Aileen Nielsen guides you through technical, legal, and ethical aspects of making code fair and secure, while highlighting up-to-date academic research and ongoing legal developments related to fairness and algorithms.
- Identify potential bias and discrimination in data science models
- Use preventive measures to minimize bias when developing data modeling pipelines
- Understand which data pipeline components raise security and privacy concerns
- Write data processing and modeling code that implements best practices for fairness
- Recognize the complex interrelationships between fairness, privacy, and data security created by the use of machine learning models
- Apply normative and legal concepts relevant to evaluating the fairness of machine learning models…(More)”.
Open Access Book by Ken Steif: “… teaches readers how to address complex public policy problems with data and analytics using reproducible methods in R. Each of the eight chapters provides a detailed case study, showing readers how to: develop exploratory indicators; understand ‘spatial process’ and develop spatial analytics; develop ‘useful’ predictive analytics; convey these outputs to non-technical decision-makers through the medium of data visualization; and understand why, ultimately, data science and ‘Planning’ are one and the same. A graduate-level introduction to data science, this book will appeal to researchers and data scientists at the intersection of data analytics and public policy, as well as readers who wish to understand how algorithms will affect the future of government….(More)”.
Book edited by Daniel Moeckli, Anna Forgács, and Henri Ibi: “With the rise of direct-democratic instruments, the relationship between popular sovereignty and the rule of law is set to become one of the defining political issues of our time. This important and timely book provides an in-depth analysis of the limits imposed on referendums and citizens’ initiatives, as well as of systems of reviewing compliance with these limits, in 11 European states.
Chapters explore and lay the scientific basis for answering crucial questions such as ‘Where should the legal limits of direct democracy be drawn?’ and ‘Who should review compliance with these limits?’ Providing a comparative analysis of the different issues in the selected countries, the book draws out key similarities and differences, as well as an assessment of the law and the practice at national levels when judged against the international standards contained in the Venice Commission’s Guidelines on the Holding of Referendums.
Presenting an up-to-date analysis of the relationship between popular sovereignty and the rule of law, The Legal Limits of Direct Democracy will be a key resource for scholars and students in comparative and constitutional law and political science. It will also be beneficial to policy-makers and practitioners in parliaments, governments and election commissions, and experts working for international organisations….(More)”.
“A Guide for Scholars, Researchers, and Wonks” by Jonathan Schwabish: “Now more than ever, content must be visual if it is to travel far. Readers everywhere are overwhelmed with a flow of data, news, and text. Visuals can cut through the noise and make it easier for readers to recognize and recall information. Yet many researchers were never taught how to present their work visually.
This book details essential strategies to create more effective data visualizations. Jonathan Schwabish walks readers through the steps of creating better graphs and shows how to move beyond simple line, bar, and pie charts. Through more than five hundred examples, he demonstrates the do’s and don’ts of data visualization, the principles of visual perception, and how to make subjective style decisions around a chart’s design. Schwabish surveys more than eighty visualization types, from histograms to horizon charts, ridgeline plots to choropleth maps, and explains how each has its place in the visual toolkit. It might seem intimidating, but everyone can learn how to create compelling, effective data visualizations. This book will guide you as you define your audience and goals, choose the graph that best fits your data, and clearly communicate your message….(More)”.