Printed from https://ideas.repec.org/p/feb/framed/00646.html

To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study

Author

Listed:
  • John List
  • Zacharias Maniadis
  • Fabio Tufano

Abstract

The sciences are in an era of an alleged "credibility crisis". In this study, we discuss the reproducibility of empirical results, focusing on economics research. By combining theory and empirical evidence, we discuss the importance of replication studies and whether they improve our confidence in novel findings. The theory sheds light on the value of replications, even when replications are subject to bias. We then present a pilot meta-study of replication in experimental economics, a subfield that serves as a positive benchmark for investigating the credibility of economics. Our meta-study highlights certain difficulties in applying meta-research (Ioannidis et al., 2015) and systematizing the economics literature.

Suggested Citation

  • John List & Zacharias Maniadis & Fabio Tufano, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Framed Field Experiments 00646, The Field Experiments Website.
  • Handle: RePEc:feb:framed:00646

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00646.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Stefano DellaVigna & John A. List & Ulrike Malmendier, 2012. "Testing for Altruism and Social Pressure in Charitable Giving," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(1), pages 1-56.
    2. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.
    3. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    4. Andrew J. Oswald, 2007. "An Examination of the Reliability of Prestigious Scholarly Journals: Evidence and Implications for Decision‐Makers," Economica, London School of Economics and Political Science, vol. 74(293), pages 21-31, February.
    5. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    6. Blundell,Richard & Newey,Whitney K. & Persson,Torsten (ed.), 2006. "Advances in Economics and Econometrics," Cambridge Books, Cambridge University Press, number 9780521871525.
    7. List, John A, et al, 2001. "Academic Economists Behaving Badly? A Survey on Three Areas of Unethical Behavior," Economic Inquiry, Western Economic Association International, vol. 39(1), pages 162-170, January.
    8. Evanschitzky, Heiner & Baumgarth, Carsten & Hubbard, Raymond & Armstrong, J. Scott, 2007. "Replication research's disturbing trend," Journal of Business Research, Elsevier, vol. 60(4), pages 411-415, April.
    9. Andreas Ortmann & Le Zhang, 2013. "Exploring the Meaning of Significance in Experimental Economics," Discussion Papers 2013-32, School of Economics, The University of New South Wales.
    10. Maren Duvendack & Richard W. Palmer-Jones & W. Robert Reed, 2015. "Replications in Economics: A Progress Report," Econ Journal Watch, Econ Journal Watch, vol. 12(2), pages 164-191, May.
    11. Andreoni,J. & Harbaugh,W.T., 2005. "Power indices for revealed preference tests," Working papers 10, Wisconsin Madison - Social Systems.
    12. Story C. Landis & Susan G. Amara & Khusru Asadullah & Chris P. Austin & Robi Blumenstein & Eileen W. Bradley & Ronald G. Crystal & Robert B. Darnell & Robert J. Ferrante & Howard Fillit & Robert Finke, 2012. "A call for transparent reporting to optimize the predictive value of preclinical research," Nature, Nature, vol. 490(7419), pages 187-191, October.
    13. Necker, Sarah, 2014. "Scientific misbehavior in economics," Research Policy, Elsevier, vol. 43(10), pages 1747-1759.
    14. De Long, J Bradford & Lang, Kevin, 1992. "Are All Economic Hypotheses False?," Journal of Political Economy, University of Chicago Press, vol. 100(6), pages 1257-1272, December.
    15. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    16. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    17. David Card & Stefano DellaVigna & Ulrike Malmendier, 2011. "The Role of Theory in Field Experiments," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 39-62, Summer.
    18. Blundell,Richard & Newey,Whitney K. & Persson,Torsten (ed.), 2006. "Advances in Economics and Econometrics," Cambridge Books, Cambridge University Press, number 9780521692083.
    20. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    21. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
    22. Zacharias Maniadis & Fabio Tufano & John A. List, 2015. "How to Make Experimental Economics Research More Reproducible: Lessons from Other Disciplines and a New Proposal," Research in Experimental Economics, in: Replication in Experimental Economics, volume 18, pages 215-230, Emerald Group Publishing Limited.
    23. Linda Babcock & George Loewenstein, 1997. "Explaining Bargaining Impasse: The Role of Self-Serving Biases," Journal of Economic Perspectives, American Economic Association, vol. 11(1), pages 109-126, Winter.
    24. Daniele Fanelli, 2010. "Do Pressures to Publish Increase Scientists' Bias? An Empirical Support from US States Data," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-7, April.
    25. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    27. Loewenstein, George, 1999. "Experimental Economics from the Vantage-Point of Behavioural Economics," Economic Journal, Royal Economic Society, vol. 109(453), pages 23-34, February.
    28. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2017. "Persuasion Bias in Science: Can Economics Help?," Economic Journal, Royal Economic Society, vol. 127(605), pages 266-304, October.
    29. John Ioannidis & Chris Doucouliagos, 2013. "What's to Know about the Credibility of Empirical Economics?," Journal of Economic Surveys, Wiley Blackwell, vol. 27(5), pages 997-1004, December.
    30. T. D. Stanley, 2001. "Wheat from Chaff: Meta-analysis as Quantitative Literature Review," Journal of Economic Perspectives, American Economic Association, vol. 15(3), pages 131-150, Summer.
    31. Chris Doucouliagos & T.D. Stanley, 2013. "Are All Economic Facts Greatly Exaggerated? Theory Competition And Selectivity," Journal of Economic Surveys, Wiley Blackwell, vol. 27(2), pages 316-339, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: Reply to Kataria," Econ Journal Watch, Econ Journal Watch, vol. 11(1), pages 11-16, January.
    3. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working and Discussion Papers WP 5/2020, Research Department, National Bank of Slovakia.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    6. Asatryan, Zareh & Havlik, Annika & Heinemann, Friedrich & Nover, Justus, 2020. "Biases in fiscal multiplier estimates," European Journal of Political Economy, Elsevier, vol. 63(C).
    7. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    8. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    9. Sebastian Gechert & Tomas Havranek & Zuzana Irsova & Dominika Kolcunova, 2022. "Measuring Capital-Labor Substitution: The Importance of Method Choices and Publication Bias," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 45, pages 55-82, July.
    10. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    11. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    12. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    13. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    14. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    15. Havranek, Tomas & Rusnak, Marek & Sokolova, Anna, 2017. "Habit formation in consumption: A meta-analysis," European Economic Review, Elsevier, vol. 95(C), pages 142-167.
    16. Soo Hong Chew & Junjian Yi & Junsen Zhang & Songfa Zhong, 2018. "Risk Aversion and Son Preference: Experimental Evidence from Chinese Twin Parents," Management Science, INFORMS, vol. 64(8), pages 3896-3910, August.
    17. Nicolas Vallois & Dorian Jullien, 2017. "Replication in experimental economics: A historical and quantitative approach focused on public good game experiments," Université Paris 1 Panthéon-Sorbonne (Post-Print and Working Papers) halshs-01651080, HAL.
    18. Hippolyte W. Balima & Eric G. Kilama & Rene Tapsoba, 2017. "Settling the Inflation Targeting Debate: Lights from a Meta-Regression Analysis," IMF Working Papers 2017/213, International Monetary Fund.
    19. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    20. Fabo, Brian & Jančoková, Martina & Kempf, Elisabeth & Pástor, Ľuboš, 2021. "Fifty shades of QE: Comparing findings of central bankers and academics," Journal of Monetary Economics, Elsevier, vol. 120(C), pages 1-20.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.