
A practical guide to Registered Reports for economists

Author

Listed:
  • Thibaut Arpinon

    (CREM, University of Rennes 1)

  • Romain Espinosa

    (CIRED, CNRS)

Abstract

The current publication system in economics has encouraged the inflation of positive results in empirical papers. Registered Reports, also called Pre-Results Reviews, are a new submission format for empirical work that takes pre-registration one step further. In Registered Reports, researchers write their papers before running the study and commit to a detailed data-collection process and analysis plan. After a first-stage review, a journal can grant an In-Principle Acceptance, guaranteeing that the paper will be published if the authors carry out the data collection and analysis as pre-specified. We propose a practical guide to Registered Reports for empirical economists. We illustrate the main problems that Registered Reports address (p-hacking, HARKing, forking, and publication bias) and present practical guidelines on how to write and review Registered Reports (e.g., the data-analysis plan, power analysis, and correction for multiple-hypothesis testing), with accompanying R and Stata code. We provide specific examples from experimental economics and show how research design can be improved to maximize statistical power. Finally, we discuss tools that authors, editors, and referees can use to evaluate Registered Reports (a checklist, a study-design table, and quality assessment).
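The power analysis the abstract refers to can be illustrated with a minimal sketch. The article itself provides R and Stata code; the following is an independent Python illustration, assuming a two-sided two-sample z-test approximation (a t-test would require slightly larger samples). All function names here (`power_two_sample`, `min_n_per_group`) are illustrative, not from the article.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse standard normal CDF by bisection (sufficient for this sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = (lo + hi) / 2.0
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def power_two_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test.
    d: standardized effect size (Cohen's d); n: observations per group."""
    z_crit = norm_ppf(1.0 - alpha / 2.0)
    ncp = d * math.sqrt(n / 2.0)  # noncentrality of the test statistic
    return norm_cdf(ncp - z_crit) + norm_cdf(-ncp - z_crit)

def min_n_per_group(d, target=0.80, alpha=0.05):
    """Smallest per-group sample size reaching the target power."""
    n = 2
    while power_two_sample(d, n, alpha) < target:
        n += 1
    return n
```

For example, detecting a medium effect (d = 0.5) with 80% power at alpha = 0.05 requires about 63 participants per group under this approximation. Committing to such a calculation in a first-stage Registered Report is what prevents underpowered designs from being rationalized after the fact.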

Suggested Citation

  • Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
  • Handle: RePEc:spr:jesaex:v:9:y:2023:i:1:d:10.1007_s40881-022-00123-1
    DOI: 10.1007/s40881-022-00123-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40881-022-00123-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.



    Citations


    Cited by:

    1. Luca A. Panzone & Natasha Auch & Daniel John Zizzo, 2024. "Nudging the Food Basket Green: The Effects of Commitment and Badges on the Carbon Footprint of Food Shopping," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 87(1), pages 89-133, January.

    Most related items

These are the items that most often cite the same works as this one and are cited by the same works.
    1. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    2. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Heath, Davidson & Ringgenberg, Matthew C. & Samadi, Mehrdad & Werner, Ingrid M., 2019. "Reusing Natural Experiments," Working Paper Series 2019-21, Ohio State University, Charles A. Dice Center for Research in Financial Economics.
    6. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    7. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    8. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    9. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    10. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    11. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    12. Hermes, Henning & Mierisch, Fabian & Peter, Frauke & Wiederhold, Simon & Lergetporer, Philipp, 2023. "Discrimination on the Child Care Market: A Nationwide Field Experiment," IZA Discussion Papers 16082, Institute of Labor Economics (IZA).
    13. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    14. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    15. Henning Hermes & Philipp Lergetporer & Fabian Mierisch & Frauke Peter & Simon Wiederhold, 2023. "Discrimination in Universal Social Programs? A Nationwide Field Experiment on Access to Child Care," CESifo Working Paper Series 10368, CESifo.
    16. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    17. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    18. Chowdhury, Shyamal & Hasan, Syed & Sharma, Uttam, 2024. "The Role of Trainee Selection in the Effectiveness of Vocational Training: Evidence from a Randomized Controlled Trial in Nepal," IZA Discussion Papers 16705, Institute of Labor Economics (IZA).
    19. Hermes, Henning & Krauß, Marina & Lergetporer, Philipp & Peter, Frauke & Wiederhold, Simon, 2022. "Early Child Care and Labor Supply of Lower-SES Mothers: A Randomized Controlled Trial," IZA Discussion Papers 15814, Institute of Labor Economics (IZA).
    20. Roggenkamp, Hauke C., 2024. "Revisiting ‘Growth and Inequality in Public Good Provision’—Reproducing and Generalizing Through Inconvenient Online Experimentation," OSF Preprints 6rn97, Center for Open Science.

    More about this item

    Keywords

    Registered Reports; Practical guide; Pre-registration; p-hacking; HARKing; Multiple-hypothesis testing; Power analysis; The smallest effect size of interest;

    JEL classification:

    • A10 - General Economics and Teaching - - General Economics - - - General
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jesaex:v:9:y:2023:i:1:d:10.1007_s40881-022-00123-1. See general information about how to correct material in RePEc.


For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic, or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.