If Research Results Can’t Be Reproduced, Are They Still Valid?

A Coventry University academic has made a key contribution to the largest ever study into the ‘reproducibility’ of psychology research.

Dr Gavin Sullivan from the University’s Centre for Psychology, Behaviour and Achievement is one of 270 researchers around the world who investigated – as part of an Open Science Framework (OSF) project published in Science magazine – how easy it is to replicate the results from previous studies in the field of psychology.

The ability to reproduce a scientific result independently of the original study is regarded by many as the criterion for reliable scientific findings, so the aim of the OSF project was to explore ways in which psychologists can best establish credible knowledge for each other and the general public.

Launched nearly four years ago, the ‘Reproducibility Project: Psychology’ has produced the most comprehensive investigation ever conducted into the rate and predictors of reproducibility in a field of science.

The researchers conducted replications of 100 published findings from three prominent psychology journals. They found that, regardless of the analytic method or criteria used, fewer than half of the replications produced the same findings as the original studies.

Dr Sullivan, who reproduced one of the 100 studies for the project, said:

“Members of the public regularly read news stories based on the results of published studies in psychology, and often consider whether to change what they know, believe in or do. What we looked at in this research is whether these studies can actually be reproduced, and therefore whether we can be confident in the credibility of studies in psychological science.

“Illustrating this with my own contribution to the replication project, I reproduced a study in which drawn representations of the emotions expressed by men and women of different ethnicities – Asian, African, Caucasian – were presented to people in order to judge the accuracy of their identification of one emotion in particular: pride.

“While the results indicated that people did identify expressions of pride with an accuracy better than chance – in accordance with the original study – a gender difference found in the original study was not reproduced. The result, therefore, had been partially – but arguably not completely – replicated.”
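A claim of “accuracy better than chance”, like the one Dr Sullivan describes, is typically backed by a statistical test. The sketch below is a hypothetical illustration of one common approach, an exact one-sided binomial test; the participant numbers, success count, and chance level are invented for the example and are not figures from the study.

```python
# Hypothetical numbers for illustration only: suppose 40 participants each
# judged one drawing, 28 correctly labelled the expression "pride", and
# guessing between two response options would give a chance rate of 0.5.
from math import comb

def binomial_p_above_chance(successes, trials, chance):
    """One-sided exact binomial p-value: the probability of observing at
    least `successes` correct answers if true accuracy were only `chance`."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(successes, trials + 1)
    )

p = binomial_p_above_chance(28, 40, 0.5)
print(f"p = {p:.4f}")  # a small p-value suggests accuracy credibly above chance
```

A replication team can rerun exactly this calculation on its own data, which is one reason the project asked every team to post raw data and analysis code publicly.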

Mallory Kidwell, one of the project coordinators from the Center for Open Science, said:

“The results provide suggestive evidence toward the challenges of reproducing research findings, including identifying predictors of reproducibility and practices to improve it.”

So, what does this mean for the reliability of psychological research? The team emphasised that a failure to reproduce does not necessarily mean the original report was incorrect. However, a challenge for psychology and other fields is that incentives for scientists are not consistently aligned with reproducibility – research with new, surprising findings is more likely to be published than research examining when, why, or how existing findings can be reproduced. As a consequence, it is in many scientists’ career interests to pursue innovative research, even at the cost of reproducibility of the findings.

Three potential reasons for the study’s low reproducibility rate were proposed by the researchers:

1) Even though most replication teams worked with the original authors to use the same materials and methods, small differences in when, where, or how the replication was carried out might have influenced the results;
2) The replication might have failed to detect the original result by chance;
3) The original result might have been a false positive.
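The second reason, a replication missing a real effect by chance, is easy to underestimate. The simulation below is a hedged sketch, not an analysis of the project’s data: it assumes a genuine but modest effect (Cohen’s d of 0.3, 50 participants per group, values chosen only for illustration) and counts how often a straightforward significance test on fresh samples reaches p < .05.

```python
# Hypothetical illustration (not the project's data): even a real effect
# can fail to replicate by chance when statistical power is modest.
import math
import random

random.seed(42)

def replication_significant(true_effect, n_per_group, alpha=0.05):
    """Simulate one replication: two groups of unit-variance normal scores,
    the treatment group shifted by `true_effect`. Returns True if a
    two-sided z-test on the difference in means reaches significance."""
    control = [random.gauss(0.0, 1.0) for _ in range(n_per_group)]
    treated = [random.gauss(true_effect, 1.0) for _ in range(n_per_group)]
    diff = sum(treated) / n_per_group - sum(control) / n_per_group
    se = math.sqrt(2.0 / n_per_group)      # standard error, known unit variance
    z = diff / se
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return p < alpha

sims = 2000
hits = sum(replication_significant(0.3, 50) for _ in range(sims))
print(f"replications reaching p < .05: {hits / sims:.0%}")  # roughly a third
```

Under these assumed numbers, most replications of a perfectly real effect still come back “non-significant”, which is why the team stressed that a failed replication does not by itself mean the original report was wrong.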

Johanna Cohoon, another project coordinator from the Center for Open Science, concluded that:

“The findings demonstrate that reproducing original results may be more difficult than is presently assumed, and interventions may be needed to improve reproducibility.

“In keeping with the goals of openness and reproducibility, every replication project posted its methods on a public website, and later added their raw data and computer code for reproducing their analyses.

“Since the reproducibility project began in 2011, similar projects have emerged in other fields such as the Reproducibility Project: Cancer Biology. Also, a discipline of metascience is emerging – scientific research about scientific research – to improve research transparency and reproducibility.”

For more information, you can view the full study online at ScienceMag.org.

Coventry University