Occidental's Carmel Levitan is a member of an international team of scholars who co-authored an ambitious investigation into the reproducibility of psychological research published today.
Launched nearly four years ago and coordinated by the Center for Open Science, the Reproducibility Project: Psychology has produced the most comprehensive investigation ever conducted about the rate and predictors of reproducibility in a field of science.
The project, involving 270 scientists, replicated research from 100 studies published in three prominent psychology journals. Two of the replications were done by Levitan, associate professor of cognitive science, and her students -- one by students in her research methods class and the other by research assistants in her lab. Project researchers found that regardless of the analytic method or criteria used, fewer than half of the replications produced the same findings as the original studies.
A failure to reproduce original results doesn't necessarily mean the original report was incorrect, Levitan said. "In no cases did we detect fraud. Science is by its nature difficult to do because we don't already know the answers. I expect that with the current move towards open science, a similar project in 10 years would have greater success in replicating results."
"It was incredibly exciting to be part of such a big project with collaborations with so many scientists from all over the world," said Annemarie Schnedler '16, a cognitive science major from Palo Alto who collected data for both Occidental replication studies. "No matter how small a role I played, I still felt like I was making a difference in the scientific community."
Reproducibility -- the recurrence of research results when the same data are analyzed again, or when new data are collected using the same methods -- is a cornerstone of modern science. "Scientific evidence does not rely on trusting the authority of the person who made the discovery," says Angela Attwood, a project member from the University of Bristol. "Rather, credibility accumulates through independent replication and elaboration of the ideas and evidence."
Yet a problem for psychology and other fields is that incentives for scientists are not consistently aligned with reproducibility. "What is good for science and what is good for scientists are not always the same thing. In the present culture, scientists' key incentive is earning publications of their research, particularly in prestigious outlets," said Ljiljana Lazarević, a project member from the University of Belgrade.
Research with new, surprising findings is more likely to be published than research examining when, why, or how existing findings can be reproduced. As a consequence, it is in many scientists' career interests to pursue innovative research, even at the cost of the reproducibility of their findings.
In addition to its findings, this study illustrates how a discipline of metascience is emerging -- scientific research about scientific research. This project and the widespread efforts to improve research transparency and reproducibility are indications that "Science is actively self-examining and self-correcting to maximize the quality and efficiency of the research process in the service of building knowledge for the public good," said project member Susann Fiedler of the Max Planck Institute for Research on Collective Goods.