
I Predict A Riot


Almost a year ago I posted a blog post titled ‘A Yale Professor’s One Man Rampage Against PloS, the Internet and a Belgian Research Group‘, covering the case of a respected researcher who became very upset about a study that replicated his work and found a negative outcome. That episode may have been only a foreshock, an early tremor in a new wave of replications that is beginning to shake the foundations of the field of psychology.


Psychology is historically a field in which replications are rare. Sought-after journals with high impact factors typically refuse to publish replications, and until recently there has been little incentive for researchers to conduct work that will not further their careers and might never even be read. Negative findings have thus vanished from the record while positive findings, which may be the exception rather than the rule, receive all the attention. In psychology this has become known as the file drawer effect. Ben Goldacre’s recent book Bad Pharma discusses the prevalence of the same problem in medicine, where it has extremely grave implications. I can’t recommend the book highly enough; you can help fix the problem in medicine by signing a petition, which already has 20,000 signatures, calling for all clinical trial data to be published.
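To make the file drawer effect concrete, here is a minimal simulation sketch (my own illustration, not taken from Goldacre or any of the studies mentioned here). Many small studies of the same modest effect are run, but only the statistically significant ones are "published", and the published literature ends up overstating the true effect.

```python
# Illustrative sketch of the file drawer effect: simulate many small studies of
# a modest true effect and "publish" only the significant ones. The published
# average then overstates the real effect size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_effect, n, studies = 0.2, 30, 5000   # small true effect, small samples

observed, published = [], []
for _ in range(studies):
    treatment = rng.normal(loc=true_effect, size=n)
    control = rng.normal(loc=0.0, size=n)
    diff = treatment.mean() - control.mean()
    _, p = stats.ttest_ind(treatment, control)
    observed.append(diff)
    if p < 0.05:                          # only 'positive' findings reach the journals
        published.append(diff)

print(f"True effect:                 {true_effect:.2f}")
print(f"Mean effect, all studies:    {np.mean(observed):.2f}")
print(f"Mean effect, published only: {np.mean(published):.2f}")  # noticeably inflated
```

The unpublished null results are exactly the ones that would have pulled the published average back toward the truth, which is why their disappearance matters.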

Back to psychology: a growing collective of researchers working on The Reproducibility Project has been stepping forward to replicate studies in the field of social psychology. On Thursday The Chronicle published a fantastic article discussing the situation. The article appears to cite the headline of my blog post describing the “one-man rampage” and includes an interview with the researcher involved, who has now taken down his offending blog post (sadly also consigning to the history books, or not as the case may be, the extensive discussion between researchers about the study that appeared in the comments under the post). Another article in The Chronicle, published last year, describes the background to the replication situation.

An extensive special section of the journal Perspectives on Psychological Science titled ‘Special Section on Replicability in Psychological Science: A Crisis of Confidence?’ (still open access) describes the situation in academic detail, referring to the “acrimonious dust-up in science magazines and blogs” and asking: “is there currently a crisis of confidence in psychological science reflecting an unprecedented level of doubt among practitioners about the reliability of research findings in the field?” (Pashler & Wagenmakers, 2012) [PDF]. Fellow anonymous blogger Neuroskeptic provides an enlightening and humorous explanation of the problems currently affecting much of the world of science:

The nine circles of scientific hell (Neuroskeptic)

The special section concludes with a report from John Ioannidis on “Why Science Is Not Necessarily Self-Correcting” [PDF], citing as an example how the nonsense science of phrenology dominated neuropsychology in the 19th century. Ioannidis refers to the Library of Alexandria, the largest library of the ancient world, which was destroyed more than five times by Roman wars, Christian mobs and Arab conquests, and poses the startling question: could it be possible that information equivalent in size to the Library of Alexandria disappears every few minutes?

“Currently, there are petabytes of scientific information produced on a daily basis and millions of papers are being published annually. In most scientific fields, the vast majority of the collected data, protocols, and analyses are not available and/or disappear soon after or even before publication. If one tries to identify the raw data and protocols of papers published only 20 years ago, it is likely that very little is currently available. Even for papers published this week, readily available raw data, protocols, and analysis codes would be the exception rather than the rule. The large majority of currently published papers are mostly synoptic advertisements of the actual research. One cannot even try to reproduce the results based on what is available in the published word.”

According to Ioannidis, we are currently passing through an extraordinary age of perverse incentives in science. The prime incentive for researchers is placed firmly on new discoveries and on chasing statistical significance at all costs. Ioannidis describes a nightmare scenario, “Planet F345, Andromeda Galaxy, Year 3045268”, in which the entire process of science is distorted by numerous perverse incentives placed on researchers by dictatorial journal publishers and financial officers “recruited after successful careers as real estate agents, managers in supermarket chains, or employees in other corporate structures where they have proven that they can cut cost and make more money”. Ioannidis proposes that if we do not change our ways, then planet F345 in the year 3045268 is where we are headed. He goes as far as to speculate whether there is a “chance that wrong and inefficient medicine is currently becoming a major disaster for humans and human civilization”, and suggests there is now “an excess of statistically significant results in the literature indicative of strong biases” with “the vast majority of analyses in psychological science fine tuned to obtain a desired result”. Could it really be the case that much of the science on which we base our medicine and our understanding of the mind rests on inflated results produced by a wild goose chase for statistical significance, results that cannot be replicated?
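As a rough illustration of how “fine tuning” an analysis inflates significant findings, consider this sketch of optional stopping (my own toy example, not taken from Ioannidis’s paper): a researcher tests after every small batch of data and stops as soon as p drops below .05, even though there is no real effect to find.

```python
# Illustrative sketch: "chasing significance" by peeking at the data and
# stopping as soon as p < .05 inflates the false positive rate, even when both
# groups are drawn from exactly the same distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def optional_stopping_significant(max_n=100, start_n=10, step=5, alpha=0.05):
    """Collect data in batches, test after every batch, and declare success as
    soon as p < alpha. Any 'effect' found here is a false positive."""
    a = list(rng.normal(size=start_n))
    b = list(rng.normal(size=start_n))
    while len(a) <= max_n:
        _, p = stats.ttest_ind(a, b)
        if p < alpha:
            return True
        a.extend(rng.normal(size=step))
        b.extend(rng.normal(size=step))
    return False

runs = 2000
false_positives = sum(optional_stopping_significant() for _ in range(runs))
print(f"False positive rate with optional stopping: {false_positives / runs:.2%}")
# A single test at a fixed sample size would sit near the nominal 5%;
# letting the stopping rule depend on the p-value pushes it several times higher.
```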

We will soon see the publication of the wave of replications being conducted by the coalition of researchers participating in the Open Science Framework’s Reproducibility Project, and with it the proportion of papers, all taken from leading social psychology journals, that replicate successfully. In medicine, Ben Goldacre’s campaign for the publication of clinical trial data has gained the support of the Royal Statistical Society, The British Library, PLoS, The British Medical Journal, Cochrane, The Medical Research Council, BioMed Central… the list goes on. If you ever plan to take advantage of evidence-based medicine in your life (that means you), you’d do well to sign the petition.

If you are a researcher worried by these developments, you can cover your back by publicly registering your sample size, variables and conditions before starting your trial, by detailing your methods fully so your work can be replicated accurately, and by publishing your trial data in open repositories such as the Open Science Framework and FigShare so it can be independently analysed and potentially replicated. And please, please consider publishing your results in open access journals so the rest of the world can read your hard work. We may be approaching the dawn of a fantastic new open era of psychology, science and medicine, but this is only likely to happen on a broad scale if institutions address the incentive structures researchers work under.
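For the registration step, one concrete way to commit to a sample size in advance is an a priori power calculation. The sketch below uses statsmodels; the effect size, alpha and power values are illustrative assumptions, not recommendations for any particular study.

```python
# A minimal sketch of fixing a sample size before data collection, as one might
# when pre-registering a study. Cohen's d = 0.5, alpha = 0.05 and power = 0.8
# are illustrative assumptions only.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # expected standardised difference between groups
    alpha=0.05,               # significance threshold
    power=0.8,                # desired probability of detecting a real effect
    alternative='two-sided',
)
print(f"Participants needed per group: {n_per_group:.0f}")  # roughly 64
```

Registering that number, together with the planned variables and analysis, on a platform like the Open Science Framework before collecting data makes it much harder to quietly change course once the results start coming in.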

Learn more about the movement to replicate science:

In the lecture above from the Open Science Summit, Elizabeth Iorns speaks first about the Reproducibility Initiative, a project enabling researchers to request to have their work replicated by independent researchers, and presents evidence of widespread unsuccessful replications of clinical trials. Next is a talk by Joanne Kamens on sharing biological samples. At 55 minutes Elizabeth Bartmess and Michael Cohn of the Reproducibility Project describe the project discussed in this article, which measures the reproducibility of research in social science, and at 1 hour 16 minutes Jeff Spies describes the Open Science Framework, the infrastructure for posting raw research data online that the Reproducibility Project is using. If the video does not load above, click here.

References:

Pashler, H., & Wagenmakers, E. (2012). Editors’ Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? Perspectives on Psychological Science, 7 (6), 528-530 DOI: 10.1177/1745691612465253 [PDF]

Neuroskeptic. (2012). The Nine Circles of Scientific Hell. Perspectives on Psychological Science, 7 (6), 643-644 DOI: 10.1177/1745691612459519 [PDF]

Ioannidis, J. (2012). Why Science Is Not Necessarily Self-Correcting. Perspectives on Psychological Science, 7 (6), 645-654 DOI: 10.1177/1745691612464056 [PDF]

Image Credit: 1. The nine circles of scientific hell (Neuroskeptic). 2. Shutterstock/Vician. Video Credit: Open Science Summit (October 19, 2012). Presentations by Elizabeth Bartmess, Michael Cohn, and Jeff Spies.
