
More self-reflection in research can lead to better science

Editorial by Nature: “The durability of research findings can be cast in terms of three Rs. Findings should be reproducible (the same type of analysis using the same data should produce the same result); replicable (redoing an experiment to collect fresh data should produce the same result); and robust (alternative analyses using the same data should draw the same conclusion).

Over the past two decades, studies in fields from psychology to medicine have highlighted that these criteria are often not met, leading to talk of a crisis in replication and reproducibility. Four papers published this week in Nature look at the reproducibility, replicability and robustness of research in the social and behavioural sciences. They provide a snapshot of the analysed fields, and suggest factors that could make research findings more likely to endure. Researchers, funders, journals and institutions should take note — for the betterment of all science.

Three of the papers are an outcome of nearly US$8 million in funding provided in 2019 by the US Defense Advanced Research Projects Agency to the Systematizing Confidence in Open Research and Evidence (SCORE) programme. The project is run by the Center for Open Science, a non-profit organization in Washington DC. More than 850 researchers contributed to hundreds of duplication efforts, establishing a database of reliability markers for 3,900 papers published between 2009 and 2018 (see go.nature.com/4campyc). The fourth paper is the result of a series of one-day ‘replication games’ workshops organized around the world since 2022 by the Institute for Replication, a virtual, non-profit network.

Some of the results are sobering. For example, Tyner et al. find that statistically significant effects could be replicated for only about half of the 164 papers they studied. Moreover, the replicated effect sizes were on average less than half of what was originally reported. This ‘decline effect’ has been reported before, but it is unclear how much is due to authors’ cognitive biases, questionable research practices, the preference of journals for eye-catching results, flukes or true effects that are specific to a particular population and time…(More)”.


