Replication

The Replication Crisis in Management & Organizational Sciences

For decades, scholars have discussed how the lack of replication studies impedes scientific progress in the field of management (Starbuck, 2006). More recently, empirical investigations have cast additional doubt on whether research published in top management journals will replicate (Goldfarb & King, 2016; Open Science Collaboration, 2015). In response, top management journals have started to encourage the submission of replication studies (Bettis et al., 2016). As a result, more replication studies have been conducted and published (SI SMJ), but a large gap remains between the number of replication studies conducted and the number needed to arrive at evidence-based theories.

How Can We Encourage More Replication Studies?

We believe replication studies are currently rare for two underlying reasons: (1) strong career incentives that continue to favor developing new theory and (2) a lack of experience in designing and executing replication studies. Doctoral programs can address both constraints by (a) requiring first- and second-year doctoral students to conduct a replication study and (b) providing instruction on how to conduct high-quality replications.

ARIM provides a platform to coordinate related activities across participating universities. It offers resources for teaching replication to doctoral students and best practices for implementing replication studies. In addition, ARIM will support the publication of student replication studies and provide incentives in the form of replication study competitions and awards. Ultimately, ARIM will help overcome the replication crisis in management research by empowering a growing number of management scholars to conduct replication studies and by establishing a steady stream of replication studies in the management literature.

Further Reading

Bonett, D. G. (2021). Design and analysis of replication studies. Organizational Research Methods, 24(3), 513–529. https://doi.org/10.1177/1094428120911088

Maula, M., & Stam, W. (2019). Enhancing rigor in quantitative entrepreneurship research. Entrepreneurship Theory and Practice, 44(6), 1059–1090. https://doi.org/10.1177/1042258719891388

Köhler, T., & Cortina, J. M. (2019). Play it again, Sam! An analysis of constructive replication in the organizational sciences. Journal of Management, 47(2), 488–518. https://doi.org/10.1177/0149206319843985

Aguinis, H., & Solarino, A. M. (2019). Transparency and replicability in qualitative research: The case of interviews with elite informants. Strategic Management Journal. https://doi.org/10.1002/smj.3015

Open Science Collaboration (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716

Collins, F. S., & Tabak, L. A. (2014). Policy: NIH plans to enhance reproducibility. Nature, 505(7485), 612–613. https://doi.org/10.1038/505612a

Landis, R. S., & Rogelberg, S. G. (2013). Our scholarly practices are derailing our progress: The importance of "nothing" in the organizational sciences. Industrial and Organizational Psychology, 6(3), 299–302. https://doi.org/10.1111/iops.12054

Franzen, M. (2011). Making science news: The press relations of scientific journals and implications for scholarly communication. In Sociology of the Sciences Yearbook (pp. 333–352). Springer Netherlands. https://doi.org/10.1007/978-94-007-2085-5_17

Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124. https://doi.org/10.1371/journal.pmed.0020124