Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?

📄 Original study
Pashler, Harold, Wagenmakers, Eric-Jan • 2012 Modern Era • methodology

Plain English Summary

This editorial kicked off one of the most dramatic self-reckonings in modern science. In 2011-2012, psychology hit a full-blown 'crisis of confidence,' and the editors pinpoint three events that lit the fuse. First, a prominent researcher named Stapel was caught fabricating data wholesale. Second, Daryl Bem published a study in a top journal claiming to find evidence for ESP (precognition, specifically), which triggered both fascination and widespread ridicule. Third, Simmons and colleagues showed that by exploiting common but flexible data-analysis choices, you could make completely fake effects look statistically significant far more often than the supposed 5% false-alarm rate. That last one was a gut punch: it meant the problem wasn't just fraud or fringe topics — ordinary, accepted research practices were quietly inflating false results across the board. By 2012, surveys revealed how widespread these questionable research practices really were, a suspicious number of published results clustered right at the magic threshold of statistical significance, and bitter public fights broke out over failed attempts to replicate famous findings. The special journal section introduced here gathers fifteen articles tackling both the diagnosis and potential cures, from replication attempts and Bayesian statistics (a different way of weighing evidence) to pre-registration (publicly declaring your analysis plan before collecting data) and open data sharing. What makes this piece especially fascinating is how a single ESP paper became a catalyst that forced all of psychology to look in the mirror.

Research Notes

Key editorial framing the replication crisis in psychology, explicitly citing Bem's precognition study as a primary catalyst alongside the Stapel fraud and Simmons et al.'s false-positive demonstration. Essential context for understanding why psi research provoked methodological self-examination across all of psychology. Directly relevant to Controversy #2 (Bem) and #10 (meta-debate on research soundness).

An editorial introduction to the Perspectives on Psychological Science special section on replicability, chronicling the 'crisis of confidence' that unfolded in 2011-2012. Three catalysts are identified: the Stapel fraud case, Bem's (2011) ESP publication and the widespread mockery that followed, and Simmons et al.'s (2011) demonstration that flexible data analysis produces false-positive rates far exceeding 5%. The editors note that 2012 brought further evidence of the problem: surveys of the prevalence of questionable research practices (QRPs), suspicious clustering of p-values just below .05, and acrimonious disputes over failed social priming replications. The special section's 15 articles span diagnosis and treatment, from replication failures to Bayesian reanalyses, pre-registration, and data sharing.

📋 Cite this paper
APA
Pashler, H., & Wagenmakers, E.-J. (2012). Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? Perspectives on Psychological Science. https://doi.org/10.1177/1745691612465253
BibTeX
@article{pashler_2012_replicability_crisis,
  title = {Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?},
  author = {Pashler, Harold and Wagenmakers, Eric-Jan},
  year = {2012},
  journal = {Perspectives on Psychological Science},
  doi = {10.1177/1745691612465253},
}