Reexamining Psychokinesis: Commentary on the Bösch, Steinkamp and Boller Meta-Analysis
Plain English Summary
When a major meta-analysis (a study pooling many experiments) examined 380 studies on psychokinesis (influencing physical systems with your mind), it found a real but messy pattern. This commentary, published in the prestigious Psychological Bulletin by researchers from Princeton and elsewhere, agreed there is a genuine signal but disagreed about what explains the variation. The original analysts assumed effect sizes should not change with sample size, but this team argues that assumption is wrong for PK experiments, where motivation and feedback matter enormously. Strikingly, the four largest studies contained 320 times more data than all the others combined and produced a significantly below-chance result that a constant per-bit effect size cannot explain. The authors also dismantled the claim that hundreds of negative studies hide in file drawers: a survey of researchers turned up only about 59 missing studies, far fewer than the 1,544 the model predicts. They raised a philosophical puzzle too: how do you judge experiment quality when you don't already know the right answer?
Research Notes
Published in Psychological Bulletin (Vol. 132, No. 4, pp. 529-532) as a formal commentary in the same journal as the Bösch et al. meta-analysis. Authors from IONS (Radin), PEAR Lab at Princeton (Nelson, Dobyns), and Justus Liebig University (Houtkooper). The Radin/Bösch exchange in Psychological Bulletin is the most important published debate on the PK/RNG evidence base. Collins' experimenters' regress argument (catch-22 of judging experiment quality without knowing the correct outcome) is introduced as a fundamental epistemological point.
Responds to Bösch, Steinkamp & Boller's (2006) meta-analysis of 380 RNG psychokinesis studies. Agrees that existing data indicate a PK effect of high methodological quality with heterogeneous effect sizes, but disagrees about the source of heterogeneity. Argues that Bösch et al.'s core assumption — that effect size is independent of sample size — is incorrect for PK experiments where psychological context (motivation, feedback, bit rate) is the primary variable. Demonstrates that the four largest studies (4.54×10^11 bits) contain 320 times more data than all other studies combined and show z = -4.03, refuting the constant per-bit effect size model. Shows that the Monte Carlo file-drawer simulation has a built-in small-study effect and that an empirical survey of researchers found only ~59 unreported experiments, not the 1,544 the model predicts.
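The scaling argument above can be made concrete with a little arithmetic: for fair-coin RNG bits, the z-score of a fixed per-bit hit rate grows with the square root of the number of bits, so if the small-study effect size held, the four largest studies (4.54×10^11 bits) should have yielded an enormous positive z rather than the observed z = -4.03. A minimal sketch; the hit rate `pi` below is an illustrative value of the rough order seen in small RNG studies, not a figure from the paper:

```python
import math

def z_score(hit_rate, n_bits):
    """z-score for n_bits fair-coin (p = 0.5) trials observed
    at the given per-bit hit rate."""
    return (hit_rate - 0.5) * 2 * math.sqrt(n_bits)

# Hypothetical per-bit hit rate (illustrative, not from the paper):
pi = 0.50005

for n in (10_000, 1_000_000, 4.54e11):
    print(f"n = {n:.3g} bits -> predicted z = {z_score(pi, n):.2f}")
```

Because z scales as sqrt(n), the same tiny per-bit deviation that is barely detectable in a 10,000-bit study would predict a z in the dozens at 4.54×10^11 bits, which is the crux of the commentary's objection to the constant-effect-size assumption.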
Links
Related Papers
More in Psychokinesis
Observer Influence on Quantum Interference: Testing the von Neumann-Wigner Consciousness-Collapse Theory
New Year's Eve as a Case Study in Experimental Metaphysics: Exploring Global Consciousness in Random Physical Systems
Anomalous Entropic Effects in Physical Systems Associated with Collective Consciousness
Psychophysical Interactions with Electrical Plasma: Three Exploratory Experiments
Psychophysical Effects on an Interference Pattern in a Double-Slit Optical System: An Exploratory Analysis of Variance
📋 Cite this paper
Radin, D., Nelson, R., Dobyns, Y., & Houtkooper, J. (2006). Reexamining Psychokinesis: Commentary on the Bösch, Steinkamp and Boller Meta-Analysis. Psychological Bulletin, 132(4), 529–532. https://doi.org/10.1037/0033-2909.132.4.529
@article{radin_2006_reexamining,
  title = {Reexamining Psychokinesis: Commentary on the B{\"o}sch, Steinkamp and Boller Meta-Analysis},
  author = {Radin, D. and Nelson, R. and Dobyns, Y. and Houtkooper, J.},
  year = {2006},
  journal = {Psychological Bulletin},
  volume = {132},
  number = {4},
  pages = {529--532},
  doi = {10.1037/0033-2909.132.4.529},
}