
Replicability in Empirical Legal Research

Authors

  • Chin, Jason (University of Sydney)
  • Zeiler, Kathryn (Boston University)

Abstract

As part of a broader methodological reform movement, scientists are increasingly interested in improving the replicability of their research. Replicability allows others to perform replications that probe for errors and statistical issues that might call the original results into question. Little attention, however, has been paid to the state of replicability in the field of empirical legal research (ELR). Quality is especially important in this field because empirical legal researchers produce work that courts and other legal bodies regularly rely on. In this review article, we summarize the current state of ELR relative to the broader movement toward replicability in the social sciences. To that end, we survey recent collective replication efforts in ELR and the transparency and replicability guidelines adopted by journals that publish ELR. Based on this review, ELR appears to be lagging behind other fields in implementing reforms. We conclude with suggestions for reforms that might encourage improved replicability.

Suggested Citation

  • Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
  • Handle: RePEc:osf:lawarx:2b5k4
    DOI: 10.31219/osf.io/2b5k4

    Download full text from publisher

    File URL: https://osf.io/download/5feef5601e6d9703512ffa53/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/2b5k4?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Freese, Jeremy & Peterson, David, 2017. "Replication in Social Science," SocArXiv 5bck9, Center for Open Science.
    3. Franco, Annie & Malhotra, Neil & Simonovits, Gabor, 2015. "Underreporting in Political Science Survey Experiments: Comparing Questionnaires to Published Results," Political Analysis, Cambridge University Press, vol. 23(2), pages 306-312, April.
    4. Michael A. Clemens, 2017. "The Meaning Of Failed Replications: A Review And Proposal," Journal of Economic Surveys, Wiley Blackwell, vol. 31(1), pages 326-342, February.
    5. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    6. Caspi, Aviv & Stiglitz, Edward H., 2020. "Measuring discourse by algorithm," International Review of Law and Economics, Elsevier, vol. 62(C).
    7. Michael Clemens, 2015. "The Meaning of Failed Replications: A Review and Proposal - Working Paper 399," Working Papers 399, Center for Global Development.
    8. Simine Vazire, 2017. "Our obsession with eminence warps research," Nature, Nature, vol. 547(7661), pages 7-7, July.
    9. Hoeppner, Sven, 2019. "A note on replication analysis," International Review of Law and Economics, Elsevier, vol. 59(C), pages 98-102.
    10. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma, et al., 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    11. Rotem Botvinik-Nezer & Felix Holzmeister & Colin F. Camerer & Anna Dreber & Juergen Huber & Magnus Johannesson & Michael Kirchler & Roni Iwanir & Jeanette A. Mumford & R. Alison Adcock & Paolo Avesani, 2020. "Variability in the analysis of a single neuroimaging dataset by many teams," Nature, Nature, vol. 582(7810), pages 84-88, June.
    12. Necker, Sarah, 2014. "Scientific misbehavior in economics," Research Policy, Elsevier, vol. 43(10), pages 1747-1759.
    13. Dewald, William G & Thursby, Jerry G & Anderson, Richard G, 1986. "Replication in Empirical Economics: The Journal of Money, Credit and Banking Project," American Economic Review, American Economic Association, vol. 76(4), pages 587-603, September.
    14. Nyarko, Julian, 2019. "We’ll See You in . . . Court! The lack of arbitration clauses in international commercial contracts," International Review of Law and Economics, Elsevier, vol. 58(C), pages 6-24.
    15. John J. Donohue, 2015. "Empirical Evaluation of Law: The Dream and the Nightmare," American Law and Economics Review, Oxford University Press, vol. 17(2), pages 313-360.
    16. Hubbard, William H.J., 2019. "A replication study worth replicating: A comment on Salmanowitz and Spamann," International Review of Law and Economics, Elsevier, vol. 58(C), pages 1-2.
    17. B.D. McCullough, 2009. "Open Access Economics Journals and the Market for Reproducible Economic Research," Economic Analysis and Policy, Elsevier, vol. 39(1), pages 117-126, March.
    18. David Moher & Lex Bouter & Sabine Kleinert & Paul Glasziou & Mai Har Sham & Virginia Barbour & Anne-Marie Coriat & Nicole Foeger & Ulrich Dirnagl, 2020. "The Hong Kong Principles for assessing researchers: Fostering research integrity," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-14, July.
    19. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    20. Jeffrey Flier, 2017. "Faculty promotion must assess reproducibility," Nature, Nature, vol. 549(7671), pages 133-133, September.
    21. David Moher & Alessandro Liberati & Jennifer Tetzlaff & Douglas G Altman & The PRISMA Group, 2009. "Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement," PLOS Medicine, Public Library of Science, vol. 6(7), pages 1-6, July.
    22. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    23. John Zhuang Liu & Xueyao Li, 2019. "Legal Techniques for Rationalizing Biased Judicial Decisions: Evidence from Experiments with Real Judges," Journal of Empirical Legal Studies, John Wiley & Sons, vol. 16(3), pages 630-670, September.
    24. Lee Epstein & Andrew D. Martin, 2004. "Does Age (Really) Matter? A Response to Manning, Carroll, and Carp," Social Science Quarterly, Southwestern Social Science Association, vol. 85(1), pages 19-30, March.
    25. Doleac, Jennifer L. & Temple, Chelsea & Pritchard, David & Roberts, Adam, 2020. "Which prisoner reentry programs work? Replicating and extending analyses of three RCTs," International Review of Law and Economics, Elsevier, vol. 62(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    3. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Fišar, Miloš & Greiner, Ben & Huber, Christoph & Katok, Elena & Ozkes, Ali & Management Science Reproducibility Collaboration, 2023. "Reproducibility in Management Science," Department for Strategy and Innovation Working Paper Series 03/2023, WU Vienna University of Economics and Business.
    6. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    7. Maren Duvendack & Richard Palmer-Jones & W. Robert Reed, 2017. "What Is Meant by "Replication" and Why Does It Encounter Resistance in Economics?," American Economic Review, American Economic Association, vol. 107(5), pages 46-51, May.
    8. Nicolas Vallois & Dorian Jullien, 2017. "Replication in experimental economics: A historical and quantitative approach focused on public good game experiments," Université Paris1 Panthéon-Sorbonne (Post-Print and Working Papers) halshs-01651080, HAL.
    9. Shaw, Steven D. & Nave, Gideon, 2023. "Don't hate the player, hate the game: Realigning incentive structures to promote robust science and better scientific practices in marketing," Journal of Business Research, Elsevier, vol. 167(C).
    10. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex, et al., 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    11. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    12. Nicolas Vallois & Dorian Jullien, 2017. "Replication in Experimental Economics: A Historical and Quantitative Approach Focused on Public Good Game Experiments," GREDEG Working Papers 2017-21, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    13. Christian Zimmermann, 2015. "On the Need for a Replication Journal," Working Papers 2015-16, Federal Reserve Bank of St. Louis.
    14. Christophe Hurlin & Christophe Pérignon, 2020. "Reproducibility Certification in Economics Research," Working Papers hal-02896404, HAL.
    15. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes, et al., 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    16. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    17. Sylvérie Herbert & Hautahi Kingi & Flavio Stanchi & Lars Vilhuber, 2021. "The Reproducibility of Economics Research: A Case Study," Working papers 853, Banque de France.
    18. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
    19. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    20. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:lawarx:2b5k4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/lawarxiv/discover.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.