Replicability in psychological research: a reflection
Abstract
Background: In recent years, psychological science has suffered a crisis of confidence, marked by the low replication rates found in large collaborative projects that set out to quantify the problem. These results highlight how difficult replication can be and point to a possible excess of false positives in the published literature. Method: This opinion article reviews the state of the replicability crisis in psychology and its possible causes. Conclusions: Starting from the current state of the crisis, the article examines possible causes and their repercussions for the advancement of psychological science, including individual biases on the part of researchers, the lack of incentives for replication studies, and journals' current preference for novel, positive findings. Finally, it outlines alternatives for reversing this situation, among them openness to new statistical approaches, a restructuring of incentives, and editorial policies that facilitate replication.
Copyright (c) 2020 Interacciones
This work is licensed under a Creative Commons Attribution 4.0 International License.
The authors retain copyright and grant the journal the right of first publication, including the right to edit, reproduce, distribute, exhibit, and communicate the work nationally and internationally in print and digital media.
The digital version of the journal is registered under a Creative Commons Attribution 4.0 International (CC BY 4.0) license. This work may therefore be reproduced, distributed, and publicly communicated in digital format, provided that the authors and Interacciones are credited as the original source.
Authors may also enter into separate, additional contractual agreements for the non-exclusive distribution of the version of the article published in this journal (e.g., depositing it in an institutional repository or publishing it in a book), provided it is clearly indicated that the work was first published in this journal.