Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability

πŸ“„ Original study β†—
Nosek, Brian A., Spies, Jeffrey R., & Motyl, Matt • 2012 • Modern Era • methodology

Plain English Summary

Here's a wake-up call for science itself. The authors tried to replicate a splashy psychology finding and -- with nearly 1,300 participants -- got absolutely nothing. That personal experience fueled a deep dive into why science keeps producing results that don't hold up. The culprit? A publish-or-perish culture that rewards exciting, positive findings and buries boring-but-honest null results. They catalogue nine sneaky tricks (called questionable research practices) that researchers use, often without realizing it, like peeking at data early and stopping when it looks good, or only reporting the analyses that worked. Their fix is a bold transparency overhaul: share your data openly, pre-register your study plans before collecting data so you can't move the goalposts, and build systems that actually reward replication (re-running studies to verify them). This paper became a rallying cry for the open-science movement and remains the yardstick against which modern parapsychology research is judged.

Research Notes

A foundational open-science manifesto that directly informs the library's meta-debate controversy (#10). Its QRP taxonomy provides a checklist for evaluating every empirical psi paper in the collection, and its reform proposals (pre-registration, open data) are the standards against which modern parapsychology studies are increasingly judged.

Publication norms in academic science emphasize novel, positive results, creating incentives that inflate false-positive rates and discourage replication. Drawing on the authors' own failed replication of a provocative embodied-cognition finding (original p = .01, N = 1,979; replication p = .59, N = 1,300), the paper catalogues nine common practices that increase publishability at the expense of accuracy, including optional stopping, selective reporting, HARKing, and avoidance of direct replication. Existing remedies (negative-results journals, education campaigns, reviewer vigilance) are judged insufficient. The proposed solutions restructure incentives around open data, open methods, open workflow with pre-registration, post-publication review, and Replication Value metrics, arguing that transparency and accountability can make the abstract accuracy motive competitive with the concrete publication motive.


πŸ“‹ Cite this paper
APA
Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability. Perspectives on Psychological Science. https://doi.org/10.1177/1745691612459058
BibTeX
@article{nosek_spies_motyl_2012_scientific_utopia,
  title = {Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability},
  author = {Nosek, Brian A. and Spies, Jeffrey R. and Motyl, Matt},
  year = {2012},
  journal = {Perspectives on Psychological Science},
  doi = {10.1177/1745691612459058},
}