Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty

Research output: Contribution to journal › Journal article › Research › peer-review


  • Nate Breznau
  • Eike Mark Rinke
  • Alexander Wuttke
  • Hung H. V. Nguyen
  • Muna Adem
  • Jule Adriaans
  • Amalia Alvarez-Benjumea
  • Henrik K. Andersen
  • Daniel Auer
  • Flavio Azevedo
  • Oke Bahnsen
  • Dave Balzer
  • Gerrit Bauer
  • Paul C. Bauer
  • Markus Baumann
  • Sharon Baute
  • Verena Benoit
  • Julian Bernauer
  • Carl Berning
  • Anna Berthold
  • Felix S. Bethke
  • Thomas Biegert
  • Katharina Blinzler
  • Johannes N. Blumenberg
  • Licia Bobzien
  • Andrea Bohman
  • Thijs Bol
  • Amie Bostic
  • Zuzanna Brzozowska
  • Katharina Burgdorf
  • Kaspar Burger
  • Kathrin B. Busch
  • Juan Carlos Castillo
  • Nathan Chan
  • Pablo Christmann
  • Roxanne Connelly
  • Christian S. Czymara
  • Elena Damian
  • Alejandro Ecker
  • Achim Edelmann
  • Maureen A. Eger
  • Simon Ellerbrock
  • Anna Forke
  • Andrea Forster
  • Chris Gaasendam
  • Konstantin Gavras
  • Vernon Gayle
  • Theresa Gessler
  • Friedolin Merhout
  • Merlin Schaeffer
  • The Crowdsourced Replication Initiative

This study explores how researchers' analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers' expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team's workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers' results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.

Original language: English
Article number: 2203150119
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 119
Issue number: 44
Number of pages: 8
ISSN: 0027-8424
DOIs
Publication status: Published - 2022

Research areas

  • metascience, many analysts, researcher degrees of freedom, analytical flexibility, immigration and policy preferences, welfare state, immigration, support, redistribution, preferences, analysts, ideas
