How Copyright May Destroy Our Access To The World’s Academic Knowledge


Article by Glyn Moody: “The shift from analogue to digital has had a massive impact on most aspects of life. One area where that shift has the potential for huge benefits is in the world of academic publishing. Academic papers are costly to publish and distribute on paper, but in a digital format they can be shared globally for almost no cost. That’s one of the driving forces behind the open access movement. But as Walled Culture has reported, resistance from the traditional publishing world has slowed the shift to open access, and undercut the benefits that could flow from it.

That in itself is bad news, but new research from Martin Paul Eve (available as open access) shows that the way the shift to digital has been managed by publishers brings with it a new problem. For all their flaws, analogue publications have the great virtue that they are durable: once a library has a copy, it is likely to be available for decades, if not centuries. Digital scholarly articles come with no such guarantee. The Internet is constantly in flux, with many publishers and sites closing down each year, often without notice. That’s a problem when sites holding archival copies of scholarly articles vanish, making it harder, perhaps impossible, to access important papers. Eve explored whether publishers were placing copies of the articles they published in key archives. Ideally, digital papers would be available in multiple archives to ensure resilience, but the reality is that very few publishers did this. Ars Technica has a good summary of Eve’s results:

When Eve broke down the results by publisher, less than 1 percent of the 204 publishers had put the majority of their content into multiple archives. (The cutoff was 75 percent of their content in three or more archives.) Fewer than 10 percent had put more than half their content in at least two archives. And a full third seemed to be doing no organized archiving at all.

At the individual publication level, under 60 percent were present in at least one archive, and over a quarter didn’t appear to be in any of the archives at all. (Another 14 percent were published too recently to have been archived or had incomplete records.)…(More)”.
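The per-publisher thresholds quoted above (75 percent of content in three or more archives, half in at least two) amount to a simple tally over article-level archive memberships. A minimal sketch of that tally in Python, using invented publisher names, DOIs, and archive memberships purely for illustration:

```python
from collections import defaultdict

# Hypothetical records: (publisher, article DOI, set of archives holding it).
records = [
    ("Alpha Press", "10.1/a1", {"CLOCKSS", "Portico", "PKP PN"}),
    ("Alpha Press", "10.1/a2", {"CLOCKSS", "Portico", "LOCKSS"}),
    ("Beta Journals", "10.2/b1", {"Portico"}),
    ("Beta Journals", "10.2/b2", set()),
]

def coverage_by_publisher(records, min_archives=3, threshold=0.75):
    """Flag publishers with >= threshold of their articles in >= min_archives archives."""
    per_pub = defaultdict(list)
    for publisher, _doi, archives in records:
        per_pub[publisher].append(len(archives))
    return {
        pub: sum(n >= min_archives for n in counts) / len(counts) >= threshold
        for pub, counts in per_pub.items()
    }

print(coverage_by_publisher(records))
# Alpha Press clears the "75 percent in three or more archives" bar; Beta Journals does not.
```

The same tally with `min_archives=2, threshold=0.5` reproduces the second cutoff mentioned in the excerpt.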

EBP+: Integrating science into policy evaluation using Evidential Pluralism


Article by Joe Jones, Alexandra Trofimov, Michael Wilde & Jon Williamson: “…While the need to integrate scientific evidence in policymaking is clear, there isn’t a universally accepted framework for doing so in practice. Orthodox evidence-based approaches take Randomised Controlled Trials (RCTs) as the gold standard of evidence. Others argue that social policy issues require theory-based methods to understand the complexities of policy interventions. These divisions may only further decrease trust in science at this critical time.

EBP+ offers a broader framework within which both orthodox and theory-based methods can sit. EBP+ also provides a systematic account of how to integrate and evaluate these different types of evidence. EBP+ can offer consistency and objectivity in policy evaluation, and could yield a unified approach that increases public trust in scientifically-informed policy…

EBP+ is motivated by Evidential Pluralism, a philosophical theory of causal enquiry that has been developed over the last 15 years. Evidential Pluralism encompasses two key claims. The first, object pluralism, says that establishing that A is a cause of B (e.g., that a policy intervention causes a specific outcome) requires establishing both that A and B are appropriately correlated and that there is some mechanism which links the two and which can account for the extent of the correlation. The second claim, study pluralism, maintains that assessing whether A is a cause of B requires assessing both association studies (studies that repeatedly measure A and B, together with potential confounders, to measure their association) and mechanistic studies (studies of features of the mechanisms linking A to B), where available…(More)”.

A diagrammatic representation of Evidential Pluralism (© Jon Williamson)

The Power of Noticing What Was Always There


Book by Tali Sharot and Cass R. Sunstein: “Have you ever noticed that what is thrilling on Monday tends to become boring on Friday? Even exciting relationships, stimulating jobs, and breathtaking works of art lose their sparkle after a while. People stop noticing what is most wonderful in their own lives. They also stop noticing what is terrible. They get used to dirty air. They stay in abusive relationships. People grow to accept authoritarianism and take foolish risks. They become unconcerned by their own misconduct, blind to inequality, and more liable than ever to believe misinformation.

But what if we could find a way to see everything anew? What if you could regain sensitivity, not only to the great things in your life, but also to the terrible things you stopped noticing and so don’t try to change?

Now, neuroscience professor Tali Sharot and Harvard law professor (and presidential advisor) Cass R. Sunstein investigate why we stop noticing both the great and not-so-great things around us and how to “dishabituate” at the office, in the bedroom, at the store, on social media, and in the voting booth. This groundbreaking work, based on decades of research in the psychological and biological sciences, illuminates how we can reignite the sparks of joy, innovate, and recognize where improvements urgently need to be made. The key to this disruption—to seeing, feeling, and noticing again—is change. By temporarily changing your environment, changing the rules, changing the people you interact with—or even just stepping back and imagining change—you regain sensitivity, allowing you to more clearly identify the bad and more deeply appreciate the good…(More)”.

A.I.-Generated Garbage Is Polluting Our Culture


Article by Eric Hoel: “Increasingly, mounds of synthetic A.I.-generated outputs drift across our feeds and our searches. The stakes go far beyond what’s on our screens. The entire culture is becoming affected by A.I.’s runoff, an insidious creep into our most important institutions.

Consider science. Right after the blockbuster release of GPT-4, the latest artificial intelligence model from OpenAI and one of the most advanced in existence, the language of scientific research began to mutate. Especially within the field of A.I. itself.

A study published this month examined scientists’ peer reviews — researchers’ official pronouncements on others’ work that form the bedrock of scientific progress — across a number of high-profile and prestigious scientific conferences studying A.I. At one such conference, those peer reviews used the word “meticulous” more than 34 times as often as reviews did the previous year. Use of “commendable” was around 10 times as frequent, and “intricate,” 11 times. Other major conferences showed similar patterns.

Such phrasings are, of course, some of the favorite buzzwords of modern large language models like ChatGPT. In other words, significant numbers of researchers at A.I. conferences were caught handing their peer reviews of others’ work over to A.I. — or, at minimum, writing them with lots of A.I. assistance. And the closer the submitted reviews were to the deadline, the more A.I. usage was found in them.

If this makes you uncomfortable — especially given A.I.’s current unreliability — or if you think that maybe it shouldn’t be A.I.s reviewing science but the scientists themselves, those feelings highlight the paradox at the core of this technology: It’s unclear what the ethical line is between scam and regular usage. Some A.I.-generated scams are easy to identify, like the medical journal paper featuring a cartoon rat sporting enormous genitalia. Many others are more insidious, like the mislabeled and hallucinated regulatory pathway described in that same paper — a paper that was peer reviewed as well (perhaps, one might speculate, by another A.I.?)…(More)”.
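The frequency jumps the study reports (“meticulous” used more than 34 times as often, “commendable” around 10 times) come down to comparing per-review word rates across two years of review text. A toy sketch of that comparison, with made-up review snippets standing in for real conference data:

```python
import re

def rate_per_review(reviews, word):
    """Average occurrences of `word` per review, case-insensitive whole-word match."""
    pattern = re.compile(rf"\b{re.escape(word)}\b", re.IGNORECASE)
    hits = sum(len(pattern.findall(review)) for review in reviews)
    return hits / len(reviews)

# Invented snippets purely for illustration.
reviews_2022 = ["The proofs are solid.", "A careful, solid study."]
reviews_2023 = ["A meticulous and commendable study.",
                "Meticulous experiments; intricate, commendable analysis."]

for word in ("meticulous", "commendable"):
    before = rate_per_review(reviews_2022, word)
    after = rate_per_review(reviews_2023, word)
    ratio = after / before if before else float("inf")
    print(word, round(after, 2), "per review, ratio:", ratio)
```

A real analysis would control for review length and vocabulary drift, but the year-over-year rate ratio is the core of the statistic quoted above.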

Citizen silence: Missed opportunities in citizen science


Paper by Damon M Hall et al: “Citizen science is personal. Participation is contingent on the citizens’ connection to a topic or to interpersonal relationships meaningful to them. But from the peer-reviewed literature, scientists appear to have an acquisitive data-centered relationship with citizens. This has spurred ethical and pragmatic criticisms of extractive relationships with citizen scientists. We suggest five practical steps to shift citizen-science research from extractive to relational, reorienting the research process and providing reciprocal benefits to researchers and citizen scientists. By virtue of their interests and experience within their local environments, citizen scientists have expertise that, if engaged, can improve research methods and product design decisions. To boost the value of scientific outputs to society and participants, citizen-science research teams should rethink how they engage and value volunteers…(More)”.

Bring on the Policy Entrepreneurs


Article by Erica Goldman: “Teaching early-career researchers the skills to engage in the policy arena could prepare them for a lifetime of high-impact engagement and invite new perspectives into the democratic process.

In the first six months of the COVID-19 pandemic, the scientific literature worldwide was flooded with research articles, letters, reviews, notes, and editorials related to the virus. One study estimates that a staggering 23,634 unique documents were published between January 1 and June 30, 2020, alone.

Making sense of that emerging science was an urgent challenge. As governments all over the world scrambled to get up-to-date guidelines to hospitals and information to an anxious public, Australia stood apart in its readiness to engage scientists and decisionmakers collaboratively. The country used what was called a “living evidence” approach to synthesizing new information, making it available—and helpful—in real time.

Each week during the pandemic, the Australian National COVID‑19 Clinical Evidence Taskforce came together to evaluate changes in the scientific literature base. They then spoke with a single voice to the Australian clinical community so clinicians had rapid, evidence-based, and nationally agreed-upon guidelines to provide the clarity they needed to care for people with COVID-19.

This new model for consensus-aligned, evidence-based decisionmaking helped Australia navigate the pandemic and build trust in the scientific enterprise, but it did not emerge overnight. It took years of iteration and effort to get the living evidence model ready to meet the moment; the crisis of the pandemic opened a policy window that living evidence was poised to surge through. Australia’s example led the World Health Organization and the United Kingdom’s National Institute for Health and Care Excellence to move toward making living evidence models a pillar of decisionmaking for all their health care guidelines. On its own, this is an incredible story, but it also reveals a tremendous amount about how policies get changed…(More)”.

Meta to shut off data access to journalists


Article by Sara Fischer: “Meta plans to officially shutter CrowdTangle, the analytics tool widely used by journalists and researchers to see what’s going viral on Facebook and Instagram, the company’s president of global affairs Nick Clegg told Axios in an interview.

Why it matters: The company plans to instead offer select researchers access to a set of new data tools, but news publishers, journalists or anyone with commercial interests will not be granted access to that data.

The big picture: The effort comes amid a broader pivot from Meta away from news and politics and more toward user-generated viral videos.

  • Meta acquired CrowdTangle in 2016 at a time when publishers were heavily reliant on the tech giant for traffic.
  • In recent years, Meta has stopped investing in the tool, making it less reliable.

The new research tools include Meta’s Content Library, which it launched last year, and an API, or backend interface used by developers.

  • Both tools offer researchers access to huge swaths of data from publicly accessible content across Facebook and Instagram.
  • The tools are available in 180 languages and offer global data.
  • Researchers must apply for access to those tools through the Inter-university Consortium for Political and Social Research at the University of Michigan, which will vet their requests…(More)”

The Dark World of Citation Cartels


Article by Domingo Docampo: “In the complex landscape of modern academe, the maxim “publish or perish” has been gradually evolving into a different mantra: “Get cited or your career gets blighted.” Citations are the new academic currency, and careers now firmly depend on this form of scholarly recognition. In fact, citation has become so important that it has driven a novel form of trickery: stealth networks designed to manipulate citations. Researchers, driven by the imperative to secure academic impact, resort to forming citation rings: collaborative circles engineered to artificially boost the visibility of their work. In doing so, they compromise the integrity of academic discourse and undermine the foundation of scholarly pursuit. The story of the modern “citation cartel” is not just a result of publication pressure. The rise of the mega-journal also plays a role, as do predatory journals and institutional efforts to thrive in global academic rankings.

Over the past decade, the landscape of academic research has been significantly altered by the sheer number of scholars engaging in scientific endeavors. The number of scholars contributing to indexed publications in mathematics has doubled, for instance. In response to the heightened demand for space in scientific publications, a new breed of publishing entrepreneur has seized the opportunity, and the result is the rise of mega-journals that publish thousands of articles annually. Mathematics, an open-access journal produced by the Multidisciplinary Digital Publishing Institute, published more than 4,763 articles in 2023, making up 9.3 percent of all publications in the field, according to the Web of Science. It has an impact factor of 2.4 and an article-influence measure of just 0.37, but, crucially, it is indexed with Clarivate’s Web of Science, Elsevier’s Scopus, and other indexers, which means its citations count toward a variety of professional metrics. (By contrast, the Annals of Mathematics, published by Princeton University, contained 22 articles last year, and has an impact factor of 4.9 and an article-influence measure of 8.3.)…(More)”.

A Plan to Develop Open Science’s Green Shoots into a Thriving Garden


Article by Greg Tananbaum, Chelle Gentemann, Kamran Naim, and Christopher Steven Marcum: “…As it’s moved from an abstract set of principles about access to research and data into the realm of real-world activities, the open science movement has mirrored some of the characteristics of the open source movement: distributed, independent, with loosely coordinated actions happening in different places at different levels. Globally, many things are happening, often disconnected, but still interrelated: open science has sowed a constellation of thriving green shoots, not quite yet a garden, but all growing rapidly on arable soil.


It is now time to consider how much faster and farther the open science movement could go with more coordination. What efficiencies might be realized if disparate efforts could better harmonize across geographies, disciplines, and sectors? How would an intentional, systems-level approach to aligning incentives, infrastructure, training, and other key components of a rationally functioning research ecosystem advance the wider goals of the movement? Streamlining research processes, reducing duplication of efforts, and accelerating scientific discoveries could ensure that the fruits of open science processes and products are more accessible and equitably distributed…(More)”

Societal challenges and big qualitative data require a new era of methodological pragmatism


Blog by Alex Gillespie, Vlad Glăveanu, and Constance de Saint-Laurent: “The ‘classic’ methods we use today in psychology and the social sciences might seem relatively fixed, but they are the product of collective responses to concerns within a historical context. The 20th century methods of questionnaires and interviews made sense in a world where researchers did not have access to what people did or said, and even if they did, could not analyse it at scale. Questionnaires and interviews were suited to 20th century concerns (shaped by colonialism, capitalism, and the ideological battles of the Cold War) for understanding, classifying, and mapping opinions and beliefs.

However, what social scientists are faced with today is different due to the culmination of two historical trends. The first has to do with the nature of the problems we face. Inequalities, the climate emergency and current wars are compounded by a general rise in nationalism, populism, and especially post-truth discourses and ideologies. Nationalism and populism are not new, but the scale and sophistication of misinformation threatens to undermine collective responses to collective problems.


The second trend refers to technology and its accelerated development, especially the unprecedented accumulation of naturally occurring data (digital footprints) combined with increasingly powerful methods for data analysis (traditional and generative AI). It is often said that we live in the age of ‘big data’, but what is less often said is that this is in fact the age of ‘big qualitative data’. The biggest datasets are unstructured qualitative data (each minute adds 2.5 million Google text searches, 500 thousand photos on Snapchat, 500 hours of YouTube videos) and the most significant AI advances leverage this qualitative data and make it tractable for social research.

These two trends have been fuelling the rise in mixed methods research…(More)”. (See also their new book ‘Pragmatism and Methodology’, available open access.)