Estimating the Reproducibility of Psychological Science
Skeptical/Critical
Plain English Summary
In a landmark wake-up call for science, 270 researchers teamed up to redo 100 psychology studies from top journals. The results were striking: while 97% of originals claimed significant findings, only 36% held up on retry. Effect sizes -- how strong a finding is -- were cut in half. Social psychology fared worst at just 25% replicating, versus 50% for cognitive psychology. The likely culprits? Publication bias (journals preferring exciting positive results) and flexible analysis that makes noise look like signal. This matters hugely for parapsychology debates, because critics single out psi research for failing to replicate -- yet mainstream psychology clearly has the same problem.
Research Notes
Landmark empirical foundation for the replication crisis in psychology. Directly relevant to psi debates: Rabeyron (2020) and Kennedy (2016, 2013) cite it to contextualize whether parapsychology's replication difficulties are unique or reflect discipline-wide problems. Establishes a 36% replication base rate against which psi effect claims can be evaluated.
A collaborative effort by 270 researchers replicated 100 experimental and correlational studies from three leading psychology journals (2008 issues of Psychological Science, JPSP, and JEP:LMC) using pre-registered, high-powered designs with original materials. While 97% of original studies reported significant results, only 36% of replications achieved significance. Mean replication effect sizes (r = 0.197) were half the original magnitudes (r = 0.403). Cognitive psychology findings replicated better (50%) than social psychology (25%). Replication success correlated with original evidence strength rather than replication team characteristics, implicating publication bias and analytic flexibility as likely contributors to inflated original effects.
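The halving of effect sizes explains much of the drop in significance rates. A quick power calculation makes this concrete: a study powered to detect the original effect is roughly a coin flip against an effect half that size. The sketch below uses the standard Fisher z approximation for correlation tests; the sample size n = 100 is an illustrative assumption, not a figure from the paper.

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power_for_correlation(r, n, alpha=0.05):
    """Approximate two-sided power to detect a true correlation r
    with sample size n, using the Fisher z transformation
    (z = atanh(r), standard error 1/sqrt(n - 3))."""
    z_effect = math.atanh(r) * math.sqrt(n - 3)
    z_crit = 1.959963984540054  # two-sided 5% critical value
    return (1.0 - normal_cdf(z_crit - z_effect)
            + normal_cdf(-z_crit - z_effect))

# Original-study mean effect (r = 0.403) vs. replication mean (r = 0.197),
# at an assumed illustrative sample size of n = 100:
print(round(power_for_correlation(0.403, 100), 2))  # ~0.99
print(round(power_for_correlation(0.197, 100), 2))  # ~0.5
```

If originals were calibrated to r = 0.403 but the true effects are closer to r = 0.197, replications of the same size have only about 50% power, which is consistent with the observed 36% replication rate once some originals are false positives.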
Links
Related Papers
Companion
- Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling – John, Leslie K (2012)
- Why Most Research Findings About Psi Are False: The Replicability Crisis, the Psi Paradox and the Myth of Sisyphus – Rabeyron, Thomas (2020)
- Why Most Published Research Findings Are False – Ioannidis, John P.A (2005)
- False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant – Simmons, Joseph P (2011)
- The Garden of Forking Paths: Why Multiple Comparisons Can Be a Problem, Even When There Is No "Fishing Expedition" or "P-Hacking" and the Research Hypothesis Was Posited Ahead of Time – Gelman, Andrew (2013)
- Religious Priming: A Meta-Analysis With a Focus on Prosociality – Shariff, Azim F (2015)
- Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability – Nosek, Brian A (2012)
- The "File Drawer Problem" and Tolerance for Null Results – Rosenthal, Robert (1979)
Cited By
- Is the Methodological Revolution in Psychology Over or Just Beginning? – Kennedy, J.E (2016)
- Can Parapsychology Move Beyond the Controversies of Retrospective Meta-Analyses? – Kennedy, J.E (2013)
- Conclusions about Paranormal Phenomena – Kennedy, J.E (2013)
- Searching for the Impossible: Parapsychology's Elusive Quest – Reber, Arthur S (2019)
- The Experimental Evidence for Parapsychological Phenomena: A Review – Cardeña, Etzel (2018)
- Entertaining Without Endorsing: The Case for the Scientific Investigation of Anomalous Cognition – Schooler, Jonathan W (2018)
More in Methodology
Paranormal belief, conspiracy endorsement, and positive wellbeing: a network analysis
Planning Falsifiable Confirmatory Research
Addressing Researcher Fraud: Retrospective, Real-Time, and Preventive Strategies β Including Legal Points and Data Management That Prevents Fraud
Quantum Aspects of the Brain-Mind Relationship: A Hypothesis with Supporting Evidence
Paranormal beliefs and cognitive function: A systematic review and assessment of study quality across four decades of research
Cite this paper
Open Science Collaboration (2015). Estimating the Reproducibility of Psychological Science. Science. https://doi.org/10.1126/science.aac4716
@article{open_science_2015_reproducibility,
title = {Estimating the Reproducibility of Psychological Science},
author = {Open Science Collaboration},
year = {2015},
journal = {Science},
doi = {10.1126/science.aac4716},
}