Law and Technology Realism


Paper by Thibault Schrepel: “One may identify two current trends in the field of “Law and Technology.” The first trend concerns technological determinism. Some argue that technology is deterministic: the state of technological advancement is the determining factor of society. Others oppose that view, claiming it is society that affects technology. The second trend concerns technological neutrality. Some say that technology is neutral, meaning the effects of technology depend entirely on the social context. Others defend the opposite: they view the effects of technology as being inevitable (regardless of the society in which it is used).

Figure 1

While it is commonly accepted that technology is deterministic, I am under the impression that a majority of “Law and Technology” scholars also believe that technology is non-neutral. It follows that, according to this dominant view, (1) technology drives society in good or bad directions (determinism), and that (2) certain uses of technology may lead to the reduction or enhancement of the common good (non-neutrality). Consequently, this leads to top-down tech policies where the regulator has the impossible burden of helping society control and orient technology to the best possible extent.

This article is deterministic and non-neutral.

But, here’s the catch. Most of today’s doctrine focuses almost exclusively on the harms brought by technology (read Nick Bostrom, Frank Pasquale, Evgeny Morozov). Sure, these authors mention a few positive aspects, but still end up focusing on the negative ones. They ask to constrain technology on that basis alone. With this article, I want to raise another point: technological determinism can also drive society by providing solutions to centuries-old problems. In and of itself. This is not technological solutionism, as I am not arguing that technology can solve all of mankind’s problems, but it is not anti-solutionism either. I fear the extremes, anyway.

To make my point, I will discuss the issue addressed by Albert Hirschman in his famous book Exit, Voice, and Loyalty (Harvard University Press, 1970). Hirschman, at the time Professor of Economics at Harvard University, introduces the distinction between “exit” and “voice.” With exit, an individual exhibits her or his disagreement as a member of a group by leaving the group. With voice, the individual stays in the group but expresses her or his dissatisfaction in the hope of changing its functioning. Hirschman summarizes his theory on page 121, with the understanding that the optimal situation for any individual is to be capable of both “exit” and “voice”….(More)”.

Why hypothesis testers should spend less time testing hypotheses


Paper by Scheel, Anne M., Leonid Tiokhin, Peder M. Isager, and Daniel Lakens: “For almost half a century, Paul Meehl educated psychologists about how the mindless use of null-hypothesis significance tests made research on theories in the social sciences basically uninterpretable (Meehl, 1990). In response to the replication crisis, reforms in psychology have focused on formalising procedures for testing hypotheses. These reforms were necessary and impactful. However, as an unexpected consequence, psychologists have begun to realise that they may not be ready to test hypotheses. Forcing researchers to prematurely test hypotheses before they have established a sound ‘derivation chain’ between test and theory is counterproductive. Instead, various non-confirmatory research activities should be used to obtain the inputs necessary to make hypothesis tests informative.

Before testing hypotheses, researchers should spend more time forming concepts, developing valid measures, establishing the causal relationships between concepts and their functional form, and identifying boundary conditions and auxiliary assumptions. Providing these inputs should be recognised and incentivised as a crucial goal in and of itself.

In this article, we discuss how shifting the focus to non-confirmatory research can tie together many loose ends of psychology’s reform movement and help us lay the foundation to develop strong, testable theories, as Paul Meehl urged us to….(More)”
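Meehl’s point about mindless significance testing can be made concrete with a small simulation (illustrative only, not from the paper): when the null hypothesis is true by construction, roughly 5% of tests still come out “significant” at α = .05, so a lone significant p-value carries no theoretical information without the derivation chain the authors describe. The sketch below uses Welch’s t statistic with a normal approximation to its distribution, which is reasonable at n = 50 per group.

```python
import random
import statistics
from statistics import NormalDist

def welch_t(a, b):
    # Welch's t statistic for two independent samples
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (ma - mb) / se

def two_sided_p(t):
    # Normal approximation to the t distribution (adequate for n = 50 per group)
    return 2 * (1 - NormalDist().cdf(abs(t)))

random.seed(42)
n_sims, n, alpha = 2000, 50, 0.05
false_positives = 0
for _ in range(n_sims):
    # Both groups are drawn from the same distribution: the null is true by construction
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    if two_sided_p(welch_t(a, b)) < alpha:
        false_positives += 1

rate = false_positives / n_sims
print(f"'Significant' results under a true null: {rate:.1%}")
```

Run repeatedly, the rate hovers around the nominal 5%: the test behaves exactly as designed, which is precisely why a significant result, absent valid measures and auxiliary assumptions, tells us little about any theory.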

The Open Innovation in Science research field: a collaborative conceptualisation approach


Paper by Susanne Beck et al: “Openness and collaboration in scientific research are attracting increasing attention from scholars and practitioners alike. However, a common understanding of these phenomena is hindered by disciplinary boundaries and disconnected research streams. We link dispersed knowledge on Open Innovation, Open Science, and related concepts such as Responsible Research and Innovation by proposing a unifying Open Innovation in Science (OIS) Research Framework. This framework captures the antecedents, contingencies, and consequences of open and collaborative practices along the entire process of generating and disseminating scientific insights and translating them into innovation. Moreover, it elucidates individual-, team-, organisation-, field-, and society‐level factors shaping OIS practices. To conceptualise the framework, we employed a collaborative approach involving 47 scholars from multiple disciplines, highlighting both tensions and commonalities between existing approaches. The OIS Research Framework thus serves as a basis for future research, informs policy discussions, and provides guidance to scientists and practitioners….(More)”.

Building and maintaining trust in research


Daniel Nunan at the International Journal of Market Research: “One of the many indirect consequences of the COVID pandemic for the research sector may be the impact upon consumers’ willingness to share data. This is reflected in concerns that government mandated “apps” designed to facilitate COVID testing and tracking schemes will undermine trust in the commercial collection of personal data (WARC, 2020). For example, uncertainty over the consequences of handing over data and the ways in which it might be used could reduce consumers’ willingness to share data with organizations, and reverse a trend that has seen growing levels of consumer confidence in Data Protection Regulations (Data & Direct Marketing Association [DMA], 2020). This highlights how central the role of trust has become in contemporary research practice, and how fragile the process of building trust can be due to the ever competing demands of public and private data collectors.

For researchers, there are two sides to trust. One relates to building sufficient trust with research participants to facilitate data collection, and the second is building trust with the users of research. Trust has long been understood as a key factor in effective research relationships, with trust between researchers and users of research the key factor in determining the extent to which research is actually used (Moorman et al., 1993). In other words, a trusted messenger is just as important as the contents of the message. In recent years, there has been growing concern over declining trust in research from research participants and the general public, manifested in declining response rates and challenges in gaining participation. Understanding how to build consumer trust is more important than ever, as the shift of communication and commercial activity to digital platforms alters the mechanisms through which trust is built. Trust is therefore essential both for ensuring that accurate data can be collected, and that insights from research have the necessary legitimacy to be acted upon. The two research notes in this issue provide an insight into new areas where the issue of trust needs to be considered within research practice….(More)”.

Morphing Intelligence: From IQ Measurement to Artificial Brains


Book by Catherine Malabou: “What is intelligence? The concept crosses and blurs the boundaries between natural and artificial, bridging the human brain and the cybernetic world of AI. In this book, the acclaimed philosopher Catherine Malabou ventures a new approach that emphasizes the intertwined, networked relationships among the biological, the technological, and the symbolic.

Malabou traces the modern metamorphoses of intelligence, seeking to understand how neurobiological and neurotechnological advances have transformed our view. She considers three crucial developments: the notion of intelligence as an empirical, genetically based quality measurable by standardized tests; the shift to the epigenetic paradigm, with its emphasis on neural plasticity; and the dawn of artificial intelligence, with its potential to simulate, replicate, and ultimately surpass the workings of the brain. Malabou concludes that a dialogue between human and cybernetic intelligence offers the best if not the only means to build a democratic future. A strikingly original exploration of our changing notions of intelligence and the human and their far-reaching philosophical and political implications, Morphing Intelligence is an essential analysis of the porous border between symbolic and biological life at a time when once-clear distinctions between mind and machine have become uncertain….(More)”.

Science Fictions: Exposing Fraud, Bias, Negligence and Hype in Science


Book by Stuart Ritchie: “So much relies on science. But what if science itself can’t be relied on?

Medicine, education, psychology, health, parenting – wherever it really matters, we look to science for guidance. Science Fictions reveals the disturbing flaws that undermine our understanding of all of these fields and more.

While the scientific method will always be our best and only way of knowing about the world, in reality the current system of funding and publishing science not only fails to safeguard against scientists’ inescapable biases and foibles, it actively encourages them. Many widely accepted and highly influential theories and claims – about ‘priming’ and ‘growth mindset’, sleep and nutrition, genes and the microbiome, as well as a host of drugs, allergies and therapies – turn out to be based on unreliable, exaggerated and even fraudulent papers. We can trace their influence in everything from austerity economics to the anti-vaccination movement, and occasionally count the cost of them in human lives….(More)”.

What science can do for democracy: a complexity science approach


Paper by Tina Eliassi-Rad et al: “Political scientists have conventionally assumed that achieving democracy is a one-way ratchet. Only very recently has the question of “democratic backsliding” attracted any research attention. We argue that democratic instability is best understood with tools from complexity science. The explanatory power of complexity science arises from several features of complex systems. Their relevance in the context of democracy is discussed. Several policy recommendations are offered to help (re)stabilize current systems of representative democracy…(More)”.

Data Journeys in the Sciences


Book edited by Sabina Leonelli and Niccolò Tempini: “This groundbreaking, open access volume analyses and compares data practices across several fields through the analysis of specific cases of data journeys. It brings together leading scholars in the philosophy, history and social studies of science to achieve two goals: tracking the travel of data across different spaces, times and domains of research practice; and documenting how such journeys affect the use of data as evidence and the knowledge being produced. 

The volume captures the opportunities, challenges and concerns involved in making data move from the sites in which they are originally produced to sites where they can be integrated with other data, analysed and re-used for a variety of purposes. The in-depth study of data journeys provides the necessary ground to examine disciplinary, geographical and historical differences and similarities in data management, processing and interpretation, thus identifying the key conditions of possibility for the widespread data sharing associated with Big and Open Data. 

The chapters are ordered in sections that broadly correspond to different stages of the journeys of data, from their generation to the legitimisation of their use for specific purposes. Additionally, the preface to the volume provides a variety of alternative “roadmaps” aimed to serve the different interests and entry points of readers; and the introduction provides a substantive overview of what data journeys can teach about the methods and epistemology of research….(More)”.

Social Research in Times of Big Data: The Challenges of New Data Worlds and the Need for a Sociology of Social Research


Paper by Rainer Diaz-Bone et al: “The phenomenon of big data not only deeply affects current societies but also poses crucial challenges to social research. This article argues for moving towards a sociology of social research in order to characterize the new qualities of big data and its deficiencies. We draw on the neopragmatist approach of economics of convention (EC) as a conceptual basis for such a sociological perspective.

This framework suggests investigating processes of quantification in their interplay with orders of justifications and logics of evaluation. Methodological issues such as the question of the “quality of big data” must accordingly be discussed in their deep entanglement with epistemic values, institutional forms, and historical contexts and as necessarily implying political issues such as who controls and has access to data infrastructures. On this conceptual basis, the article uses the example of health to discuss the challenges of big data analysis for social research.

Phenomena such as the rise of new and massive privately owned data infrastructures, the economic valuation of huge amounts of connected data, or the “quantified self” movement are presented as indications of a profound transformation compared to established forms of doing social research. Methodological and epistemological, but also institutional and political, strategies are presented to face the risk of being “outperformed” and “replaced” by big data analyses of the kind already performed in big US and Chinese Internet enterprises. In conclusion, we argue that the sketched developments have important implications both for research practices and methods teaching in the era of big data…(More)”.