Statistical Problems in ESP Research
🧐 Skeptical/Critical
Plain English Summary
Back in 1978, Stanford statistician Persi Diaconis -- who also happened to be a former professional magician -- took a hard look at decades of ESP research and found it riddled with problems. His unique combo of statistical chops and stage-magic know-how let him spot both number-crunching tricks and physical ways information might leak to subjects. He nailed four big culprits: researchers stopping data collection the moment results looked good; testing tons of conditions but only reporting the hits; using "random" card sequences with detectable patterns clever guessers could exploit; and plain old sensory leakage through subtle visual, sound, or smell cues. The kicker? When experiments actually plugged these holes, the impressive ESP effects shrank or vanished. This hugely influential paper directly shaped the famous Hyman-Honorton debate and pushed the field toward pre-registration, computer randomization, and double-blind designs -- reforms echoed in today's broader reproducibility crisis across all of science.
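The "testing tons of conditions but only reporting the hits" problem is easy to see in a quick simulation. This is an illustrative sketch (not from the paper): every guesser is at pure chance on a 5-symbol Zener deck, yet if you run enough conditions and keep only the ones crossing a nominal .05 cutoff, a spurious "hit" shows up most of the time. The trial counts, cutoff, and condition count here are hypothetical choices for illustration.

```python
# Illustrative sketch: multiple testing with selective reporting.
# Each "condition" is 100 card guesses at pure chance (p = 1/5).
# A hit count of 27+ is roughly the one-sided .05 cutoff for
# Binomial(100, 0.2) (mean 20, sd 4). With 20 chance-level
# conditions, at least one "significant" result is very likely.
import random

def experiment(n_trials=100, p_hit=0.2):
    """Hit count for one chance-level guesser (no ESP)."""
    return sum(random.random() < p_hit for _ in range(n_trials))

random.seed(7)
threshold = 27       # ~ one-sided .05 cutoff for Binomial(100, 0.2)
n_conditions = 20    # subjects/conditions tested, only "hits" reported
runs = 2000

at_least_one = sum(
    any(experiment() >= threshold for _ in range(n_conditions))
    for _ in range(runs)
) / runs
print(f"P(>=1 'significant' condition out of {n_conditions}): {at_least_one:.2f}")
```

With a per-condition false-positive rate near .05, the chance of at least one spurious success across 20 independent conditions is roughly 1 - 0.95^20 ≈ 0.64, which the simulation reproduces.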
Research Notes
Foundational skeptical paper by a Stanford statistician who was also a former professional magician — dual expertise that gave the critique unusual specificity regarding both statistical artifacts and physical mechanisms for sensory leakage. One of the most influential criticisms of Rhine-era ESP research. Directly cited in debates that produced the Hyman-Honorton Joint Communiqué (1986). Pairs with kennedy_1981_skepticism_negative_results and the modern reproducibility-crisis literature (Simmons 2011, Ioannidis 2005, Button 2013).
Landmark statistical critique of ESP research by Stanford statistician Persi Diaconis, published in Science (Vol. 201, No. 4351, pp. 131–136). Analyzes four classes of methodological problems that generate spurious positive results: (1) optional stopping — analyzing data repeatedly and halting collection when significance is reached, without a pre-specified stopping rule; (2) multiple testing — examining many subjects, conditions, and measures then reporting only significant outcomes; (3) inadequate randomization — pseudo-random target sequences in Rhine-era experiments had detectable statistical regularities that subjects' guessing strategies exploited; and (4) sensory leakage — insufficient physical isolation in card-guessing paradigms provided olfactory, visual, and auditory cues. Concludes that better-controlled experiments consistently yield weaker or null results, and that statistical analysis alone cannot validate ESP claims without rigorous experimental design. Directly motivated methodological reforms including pre-registration, automated randomization, and double-blind protocols.
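Problem (1), optional stopping, can be demonstrated with a short simulation. This is an illustrative sketch, not code from the paper: a guesser performs at pure chance (p = 1/5), but the experimenter checks a one-sided z-test after every batch of 25 guesses and stops the moment the result looks significant. The batch size, trial cap, and z cutoff are hypothetical parameters chosen for illustration.

```python
# Illustrative sketch: optional stopping inflates false positives.
# The subject has NO ESP (chance hit rate p = 1/5 on a 5-symbol deck).
# "Peeking" = test after every batch and stop at the first z > 1.645
# (nominal one-sided .05); the fixed-n design tests only once at the end.
import math
import random

def z_hits(hits, n, p=0.2):
    """One-sided z statistic for observed hits vs. chance rate p."""
    return (hits - n * p) / math.sqrt(p * (1 - p) * n)

def run_subject(peek, max_trials=1000, batch=25, z_crit=1.645):
    """Simulate one chance-level guesser; True if declared 'significant'."""
    hits = 0
    for trial in range(1, max_trials + 1):
        hits += random.random() < 0.2          # chance-level guess
        if peek and trial % batch == 0 and z_hits(hits, trial) > z_crit:
            return True                        # stop early, claim success
    # Fixed-n design: a single pre-specified test at the end.
    return (not peek) and z_hits(hits, max_trials) > z_crit

random.seed(1)
sims = 2000
peeking = sum(run_subject(peek=True) for _ in range(sims)) / sims
fixed = sum(run_subject(peek=False) for _ in range(sims)) / sims
print(f"false-positive rate with optional stopping: {peeking:.3f}")
print(f"false-positive rate with fixed-n test:      {fixed:.3f}")
```

With 40 interim looks at a nominal .05 threshold, the stop-when-significant design produces "positive" results far more often than the advertised 5%, while the single pre-specified test stays near it — which is why pre-registered stopping rules became a standard reform.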
Links
Related Papers
Same Research Program
- Skepticism and Negative Results in Borderline Areas of Science — Kennedy, J.E. (1981)
- Testing for Questionable Research Practices in a Meta-Analysis: An Example from Experimental Parapsychology — Bierman, Dick J. (2016)
- Power failure: why small sample size undermines the reliability of neuroscience — Button, Katherine S. (2013)
- A Joint Communiqué: The Psi Ganzfeld Controversy — Hyman, Ray (1986)
- Remote Viewing Revisited: Well-Controlled Experiments Don't Find the "RV Effect" — Marks, David F. (1982)
More in Skeptical
Cognitive Styles and Psi: Psi Researchers Are More Similar to Skeptics Than to Lay Believers
Searching for the Impossible: Parapsychology's Elusive Quest
False-Positive Effect in the Radin Double-Slit Experiment on Observer Consciousness as Determined with the Advanced Meta-Experimental Protocol
Cross-Examining the Case for Precognition: Comment on Mossbridge and Radin (2018)
N,N-Dimethyltryptamine and the Pineal Gland: Separating Fact from Myth
📋 Cite this paper
Diaconis, Persi (1978). Statistical Problems in ESP Research. Science, 201(4351), 131–136. https://doi.org/10.1126/science.201.4351.131
@article{diaconis_1978_statistical_esp,
title = {Statistical Problems in ESP Research},
author = {Diaconis, Persi},
year = {1978},
journal = {Science},
volume = {201},
number = {4351},
pages = {131--136},
doi = {10.1126/science.201.4351.131},
}