The Dark World of Citation Cartels


Article by Domingo Docampo: “In the complex landscape of modern academe, the maxim “publish or perish” has been gradually evolving into a different mantra: “Get cited or your career gets blighted.” Citations are the new academic currency, and careers now firmly depend on this form of scholarly recognition. In fact, citation has become so important that it has driven a novel form of trickery: stealth networks designed to manipulate citations. Researchers, driven by the imperative to secure academic impact, resort to forming citation rings: collaborative circles engineered to artificially boost the visibility of their work. In doing so, they compromise the integrity of academic discourse and undermine the foundation of scholarly pursuit. The story of the modern “citation cartel” is not just a result of publication pressure. The rise of the mega-journal also plays a role, as do predatory journals and institutional efforts to thrive in global academic rankings.

Over the past decade, the landscape of academic research has been significantly altered by the sheer number of scholars engaging in scientific endeavors. The number of scholars contributing to indexed publications in mathematics has doubled, for instance. In response to the heightened demand for space in scientific publications, a new breed of publishing entrepreneur has seized the opportunity, and the result is the rise of mega-journals that publish thousands of articles annually. Mathematics, an open-access journal produced by the Multidisciplinary Digital Publishing Institute, published 4,763 articles in 2023, making up 9.3 percent of all publications in the field, according to the Web of Science. It has an impact factor of 2.4 and an article-influence measure of just 0.37, but, crucially, it is indexed with Clarivate’s Web of Science, Elsevier’s Scopus, and other indexers, which means its citations count toward a variety of professional metrics. (By contrast, the Annals of Mathematics, published by Princeton University, contained 22 articles last year, and has an impact factor of 4.9 and an article-influence measure of 8.3.)…(More)”
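As a rough sanity check, the journal figures quoted above can be turned into a back-of-the-envelope calculation. The inputs below are the article’s own numbers; the implied field total and the influence ratio are my own arithmetic, not figures from the piece:

```python
# Back-of-the-envelope check of the journal figures quoted above.
# All inputs come from the article; the derived values are my own.

mathematics_articles = 4763      # Mathematics (MDPI), 2023
share_of_field = 0.093           # 9.3% of indexed publications in the field

# Implied total number of indexed mathematics publications in 2023
implied_field_total = mathematics_articles / share_of_field
print(f"Implied field total: ~{implied_field_total:,.0f} articles")  # ~51,215

# Contrast in article-influence scores: Annals vs. Mathematics
annals_ai, mathematics_ai = 8.3, 0.37
print(f"Article-influence ratio: ~{annals_ai / mathematics_ai:.0f}x")  # ~22x
```

The scale of the contrast is the point: one journal publishes two hundred times as many articles, with roughly one twenty-second the per-article influence.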

A Plan to Develop Open Science’s Green Shoots into a Thriving Garden


Article by Greg Tananbaum, Chelle Gentemann, Kamran Naim, and Christopher Steven Marcum: “…As it’s moved from an abstract set of principles about access to research and data into the realm of real-world activities, the open science movement has mirrored some of the characteristics of the open source movement: distributed, independent, with loosely coordinated actions happening in different places at different levels. Globally, many things are happening, often disconnected, but still interrelated: open science has sowed a constellation of thriving green shoots, not quite yet a garden, but all growing rapidly on arable soil.

It is now time to consider how much faster and farther the open science movement could go with more coordination. What efficiencies might be realized if disparate efforts could better harmonize across geographies, disciplines, and sectors? How would an intentional, systems-level approach to aligning incentives, infrastructure, training, and other key components of a rationally functioning research ecosystem advance the wider goals of the movement? Streamlining research processes, reducing duplication of efforts, and accelerating scientific discoveries could ensure that the fruits of open science processes and products are more accessible and equitably distributed…(More)”

Societal challenges and big qualitative data require a new era of methodological pragmatism


Blog by Alex Gillespie, Vlad Glăveanu, and Constance de Saint-Laurent: “The ‘classic’ methods we use today in psychology and the social sciences might seem relatively fixed, but they are the product of collective responses to concerns within a historical context. The 20th century methods of questionnaires and interviews made sense in a world where researchers did not have access to what people did or said, and even if they did, could not analyse it at scale. Questionnaires and interviews were suited to 20th century concerns (shaped by colonialism, capitalism, and the ideological battles of the Cold War) for understanding, classifying, and mapping opinions and beliefs.

However, what social scientists are faced with today is different due to the culmination of two historical trends. The first has to do with the nature of the problems we face. Inequalities, the climate emergency and current wars are compounded by a general rise in nationalism, populism, and especially post-truth discourses and ideologies. Nationalism and populism are not new, but the scale and sophistication of misinformation threatens to undermine collective responses to collective problems.

The second trend refers to technology and its accelerated development, especially the unprecedented accumulation of naturally occurring data (digital footprints) combined with increasingly powerful methods for data analysis (traditional and generative AI). It is often said that we live in the age of ‘big data’, but what is less often said is that this is in fact the age of ‘big qualitative data’. The biggest datasets are unstructured qualitative data (each minute adds 2.5 million Google text searches, 500 thousand photos on Snapchat, 500 hours of YouTube videos) and the most significant AI advances leverage this qualitative data and make it tractable for social research.
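To get a feel for the scale those per-minute figures imply, a quick sketch (the rates are the blog’s figures; the daily totals are simple arithmetic of my own):

```python
# Scaling the per-minute figures quoted above to a single day.
# Inputs are the blog's figures; everything else is arithmetic.

per_minute = {
    "Google text searches": 2_500_000,
    "Snapchat photos": 500_000,
    "YouTube video hours": 500,
}

minutes_per_day = 24 * 60
for name, rate in per_minute.items():
    print(f"{name}: {rate * minutes_per_day:,} per day")

# 500 hours of video per minute amounts to ~82 years of footage every day
years_per_day = 500 * minutes_per_day / (24 * 365)
print(f"~{years_per_day:.0f} years of video uploaded per day")
```

Even at these crude rates, a single day of uploads exceeds what any team of human analysts could ever read or watch, which is the authors’ case for AI-assisted qualitative methods.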

These two trends have been fuelling the rise in mixed methods research…(More)” (See also their new book ‘Pragmatism and Methodology’ (open access).)

Why Do Universities Ignore Good Ideas?


Article by Jeffrey Funk: “Here is a recent assessment of 2023 Nobel Prize winner Katalin Karikó:

“Eight current and former colleagues of Karikó told The Daily Pennsylvanian that — over the course of three decades — the university repeatedly shunned Karikó and her research, despite its groundbreaking potential.”

Another article claims that this occurred because she could not get the financial support to continue her research.

Why couldn’t she get financial support? “You’re more likely to get grants if you’re a tenured faculty member, but you’re more likely to get promoted to tenure if you get grants,” said Eric Feigl-Ding, an epidemiologist at the New England Complex Systems Institute and a former faculty member and researcher at Harvard Medical School. “There is a vicious cycle,” he says.

Interesting. So, the idea doesn’t matter. What matters to funding agencies is that you have previously obtained funding or are a tenured professor. Really? Are funding agencies this narrow-minded?

Mr. Feigl-Ding also said, “Universities also tend to look at how much a researcher publishes, or how widely covered by the media their work is, as opposed to how innovative the research is.” But why couldn’t Karikó get published?

Science magazine tells the story of her main paper with Drew Weissman in 2005. After being rejected by Nature within 24 hours: “It was similarly rejected by Science and by Cell, and the word incremental kept cropping up in the editorial staff comments.”

Incremental? There are more than two million papers published each year, and this research, for which Karikó and Weissman won a Nobel Prize, was deemed incremental? If it had been rejected for flawed methods or for contents too implausible to believe, I think most people could understand the rejection. But incremental?

Obviously, most of the two million papers published each year really are incremental. Yet one of the few papers that we can all agree was not incremental was rejected because it was deemed incremental.

Furthermore, this is happening in a system of science in which even Nature admits “disruptive science has declined,” few science-based technologies are being successfully commercialized, and Nature concedes that it doesn’t understand why…(More)”.

A complexity science approach to law and governance


Introduction to a Special Issue by Pierpaolo Vivo, Daniel M. Katz and J. B. Ruhl: “The premise of this Special Issue is that legal systems are complex adaptive systems, and thus complexity science can be usefully applied to improve understanding of how legal systems operate, perform and change over time. The articles that follow take this proposition as a given and act on it using a variety of methods applied to a broad array of legal system attributes and contexts. Yet not too long ago some prominent legal scholars expressed scepticism that this field of study would produce more than broad generalizations, if even that. To orient readers unfamiliar with this field and its history, here we offer a brief background on how using complexity science to study legal systems has advanced from claims of ‘pseudoscience’ status to a widely adopted mainstream method. We then situate and summarize the articles.

The focus of complexity science is complex adaptive systems (CAS), systems ‘in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing and adaptation via learning or evolution’. It is important to distinguish CAS from systems that are merely complicated, such as a combustion engine, or complex but non-adaptive, such as a hurricane. A forest or coastal ecosystem, for example, is a complex network of diverse physical and biological components which, under no central control, is highly adaptive over time…(More)”.

The Importance of Using Proper Research Citations to Encourage Trustworthy News Reporting


Article by Andy Tattersall: “…Understanding the often mysterious processes of how research is picked up and used across different sections of the media is therefore important. To do this, we looked at a sample of research papers, each with at least one author from the University of Sheffield, that had been cited in either national or local media. We obtained the data from Altmetric.com to explore whether the news story included supporting information that linked readers to the research and those behind it. These were links to any of the authors, their institution, the journal, or the research funder. We also investigated how much of this research was available via open access.

The contrasts between national and local samples were notable. National news websites were more likely to include a link to the research paper underpinning the news story. National research coverage stories were also more organic. They were more likely to be original texts written by journalists who are credited as authors. This is reflected in more idiosyncratic citation practices. Guardian writers, such as Henry Nicholls and George Monbiot, regularly provided a proper academic citation to the research at the end of their articles. This should be standard practice, but it does require those writing press releases to include formatted citations with a link as a basic first step. 

Local news coverage followed a different pattern, which is likely due to local outlets’ use of news agencies to provide stories. Much local news coverage relies on copying and pasting subscription content provided by the UK’s national news agency, PA News. Anyone who has visited their local news website in recent years will know that such sites are full of pop-ups and hyperlinks to adverts and commercial websites. As a result of this business model, local news stories contain few or no links to the research and those behind the work. Whether this practice and the lack of supporting information stem from academic institutions’ and publishers’ press releases is debatable.

Further, we found that local coverage of research is often syndicated across multiple news sites belonging to a few publishers. Consequently, if a syndicator republishes the same information across its news platforms, it replicates bad practice. A solution is to include a readily formatted citation with a link, preferably to an open access version, at the foot of the story. This would allow local media to continue linking to third-party sites whilst providing an option to explore the actual research paper, especially if that paper is open access…(More)”.

Research Project Management and Leadership


Book by P. Alison Paprica: “The project management approaches used by millions of people internationally are often too detailed or constraining to be applied to research. In this handbook, project management expert P. Alison Paprica presents guidance specifically developed to help with the planning, management, and leadership of research.

Research Project Management and Leadership provides simplified versions of globally utilized project management tools, such as the work breakdown structure to visualize scope, and offers guidance on processes, including a five-step process to identify and respond to risks. The complementary leadership guidance in the handbook is presented in the form of interview write-ups with 19 Canadian and international research leaders, each of whom describes a situation where leadership skills were important, how they responded, and what they learned. The accessible language and practical guidance in the handbook make it a valuable resource for everyone from principal investigators leading multimillion-dollar projects to graduate students planning their thesis research. The book aims to help readers understand which management and leadership tools, processes, and practices are helpful in different circumstances, and how to implement them in research settings…(More)”.

How tracking animal movement may save the planet


Article by Matthew Ponsford: “Researchers have been dreaming of an Internet of Animals. They’re getting closer to monitoring 100,000 creatures—and revealing hidden facets of our shared world….There was something strange about the way the sharks were moving between the islands of the Bahamas.

Tiger sharks tend to hug the shoreline, explains marine biologist Austin Gallagher, but when he began tagging the 1,000-pound animals with satellite transmitters in 2016, he discovered that these predators turned away from it, toward two ancient underwater hills made of sand and coral fragments that stretch out 300 miles toward Cuba. They were spending a lot of time “crisscrossing, making highly tortuous, convoluted movements” to be near them, Gallagher says. 

It wasn’t immediately clear what attracted sharks to the area: while satellite images clearly showed the subsea terrain, they didn’t pick up anything out of the ordinary. It was only when Gallagher and his colleagues attached 360-degree cameras to the animals that they were able to confirm what they were so drawn to: vast, previously unseen seagrass meadows—a biodiverse habitat that offered a smorgasbord of prey.   

The discovery did more than solve a minor mystery of animal behavior. Using the data they gathered from the sharks, the researchers were able to map an expanse of seagrass stretching across 93,000 square kilometers of Caribbean seabed—extending the total known global seagrass coverage by more than 40%, according to a study Gallagher’s team published in 2022. This revelation could have huge implications for efforts to protect threatened marine ecosystems—seagrass meadows are a nursery for one-fifth of key fish stocks and habitats for endangered marine species—and also for all of us above the waves, as seagrasses can capture carbon up to 35 times faster than tropical rainforests. 
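The coverage figures quoted above also imply a rough bound on how much seagrass was previously known worldwide. The back-calculation below uses the article’s figures; the derived bound is my own arithmetic, not a number from the study:

```python
# Rough implication of the seagrass numbers above (article figures in;
# the prior-coverage bound is my own back-calculation).

newly_mapped_km2 = 93_000        # Caribbean seagrass mapped via shark tags
extension_fraction = 0.40        # "more than 40%" increase in known coverage

# If 93,000 km2 is a >40% extension, previously known global coverage
# must have been below this upper bound:
prior_known_upper_bound = newly_mapped_km2 / extension_fraction
print(f"Prior known global coverage: < {prior_known_upper_bound:,.0f} km2")
```

In other words, a single tagging study appears to have enlarged the known extent of one of the planet’s key carbon-capturing habitats by nearly half.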

Animals have long been able to offer unique insights about the natural world around us, acting as organic sensors picking up phenomena that remain invisible to humans. More than 100 years ago, leeches signaled storms ahead by slithering out of the water; canaries warned of looming catastrophe in coal mines until the 1980s; and mollusks that close when exposed to toxic substances are still used to trigger alarms in municipal water systems in Minneapolis and Poland…(More)”.

Language Machinery


Essay by Richard Hughes Gibson: “… current debates about writing machines are not as fresh as they seem. As is quietly acknowledged in the footnotes of scientific papers, much of the intellectual infrastructure of today’s advances was laid decades ago. In the 1940s, the mathematician Claude Shannon demonstrated that language use could be both described by statistics and imitated with statistics, whether those statistics were in human heads or a machine’s memory. Shannon, in other words, was the first statistical language modeler, which makes ChatGPT and its ilk his distant brainchildren. Shannon never tried to build such a machine, but some astute early readers of his work recognized that computers were primed to translate his paper-and-ink experiments into a powerful new medium. In writings now discussed largely in niche scholarly and computing circles, these readers imagined—and even made preliminary sketches of—machines that would translate Shannon’s proposals into reality. These readers likewise raised questions about the meaning of such machines’ outputs and wondered what the machines revealed about our capacity to write.

The current barrage of commentary has largely neglected this backstory, and our discussions suffer for forgetting that issues that appear novel to us belong to the mid-twentieth century. Shannon and his first readers were the original residents of the headspace in which so many of us now find ourselves. Their ambitions and insights have left traces on our discourse, just as their silences and uncertainties haunt our exchanges. If writing machines constitute a “philosophical event” or a “prompt for philosophizing,” then I submit that we are already living in the event’s aftermath, which is to say, in Shannon’s aftermath. Amid the rampant speculation about a future dominated by writing machines, I propose that we turn in the other direction to listen to field reports from some of the first people to consider what it meant to read and write in Shannon’s world…(More)”.
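Shannon’s paper-and-ink experiments are simple enough to recreate in a few lines. The sketch below is my own illustration, not code from the essay: it tabulates word-bigram statistics from a toy sample and then generates text by sampling from them, the kind of statistical imitation of language the essay describes:

```python
import random
from collections import defaultdict

# A minimal Shannon-style statistical imitation of a text: estimate
# word-bigram statistics from a sample, then generate by sampling them.
# (An illustration of the idea, not code from the essay.)

sample = (
    "the cat sat on the mat and the dog sat on the rug "
    "and the cat saw the dog and the dog saw the cat"
).split()

# Tabulate which words follow which (the bigram frequencies)
followers = defaultdict(list)
for current, nxt in zip(sample, sample[1:]):
    followers[current].append(nxt)

# Generate: start somewhere and repeatedly sample a statistically
# plausible next word, much as Shannon did by hand with books
random.seed(42)
word = "the"
output = [word]
for _ in range(12):
    word = random.choice(followers[word])
    output.append(word)
print(" ".join(output))
```

The output is locally plausible but globally meaningless, which is precisely the property that prompted Shannon’s early readers to ask what such machines’ outputs actually mean.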

Toward a 21st Century National Data Infrastructure: Managing Privacy and Confidentiality Risks with Blended Data


Report by the National Academies of Sciences, Engineering, and Medicine: “Protecting privacy and ensuring confidentiality in data is a critical component of modernizing our national data infrastructure. The use of blended data – combining previously collected data sources – presents new considerations for responsible data stewardship. Toward a 21st Century National Data Infrastructure: Managing Privacy and Confidentiality Risks with Blended Data provides a framework for managing disclosure risks that accounts for the unique attributes of blended data and poses a series of questions to guide considered decision-making.

Technical approaches to manage disclosure risk have advanced. Recent federal legislation, regulation and guidance has described broadly the roles and responsibilities for stewardship of blended data. The report, drawing from the panel review of both technical and policy approaches, addresses these emerging opportunities and the new challenges and responsibilities they present. The report underscores that trade-offs in disclosure risks, disclosure harms, and data usefulness are unavoidable and are central considerations when planning data-release strategies, particularly for blended data…(More)”.
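One way to make the disclosure risk of blended data concrete is with k-anonymity, a standard proxy metric. The metric, the helper function, and the toy records below are my own illustration; the report does not prescribe this particular approach:

```python
from collections import Counter

# Illustrating why blended data raises disclosure risk, using k-anonymity
# as a simple proxy (my choice of metric, not the report's). A record is
# k-anonymous if at least k records share its quasi-identifier values.

def min_k(records, quasi_identifiers):
    """Smallest group size over all quasi-identifier combinations."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(counts.values())

# Hypothetical survey data: age band and region as quasi-identifiers
survey = [
    {"age": "30-39", "region": "North", "income": "45k"},
    {"age": "30-39", "region": "North", "income": "52k"},
    {"age": "40-49", "region": "South", "income": "61k"},
    {"age": "40-49", "region": "South", "income": "58k"},
]

# Blending in an administrative attribute (occupation) can make
# previously indistinguishable records unique
blended = [dict(r, occupation=o) for r, o in
           zip(survey, ["nurse", "teacher", "farmer", "farmer"])]

print(min_k(survey, ["age", "region"]))                 # 2: every combo shared
print(min_k(blended, ["age", "region", "occupation"]))  # 1: some records unique
```

The drop from k=2 to k=1 is the report’s trade-off in miniature: each added source makes the data more useful and, simultaneously, more re-identifiable.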