
A Practical Guide to Registered Reports for Economists

Authors

Listed:
  • Thibaut Arpinon

    (CREM - Centre de recherche en économie et management - UNICAEN - Université de Caen Normandie - NU - Normandie Université - UR - Université de Rennes - CNRS - Centre National de la Recherche Scientifique)

  • Romain Espinosa

    (CIRED - Centre International de Recherche sur l'Environnement et le Développement - Cirad - Centre de Coopération Internationale en Recherche Agronomique pour le Développement - EHESS - École des hautes études en sciences sociales - AgroParisTech - ENPC - École des Ponts ParisTech - Université Paris-Saclay - CNRS - Centre National de la Recherche Scientifique)

Abstract

The current publication system in economics has encouraged the inflation of positive results in empirical papers. Registered Reports, also called Pre-Results Reviews, are a new submission format for empirical work that takes pre-registration one step further. In Registered Reports, researchers write their papers before running the study and commit to a detailed data collection process and analysis plan. After a first-stage review, a journal can give an In-Principle Acceptance guaranteeing that the paper will be published if the authors carry out their data collection and analysis as pre-specified. Here we propose a practical guide to Registered Reports for empirical economists. We illustrate the major problems that Registered Reports address (p-hacking, HARKing, forking, and publication bias), and present practical guidelines on how to write and review Registered Reports (e.g., the data-analysis plan, power analysis, and correction for multiple-hypothesis testing), with R and Stata code. We provide specific examples for experimental economics, and show how research design can be improved to maximize statistical power. Finally, we discuss some tools that authors, editors, and referees can use to evaluate Registered Reports (checklist, study-design table, and quality assessment).
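The sketch below gives a concrete feel for two of the techniques the guide covers: an a priori power analysis and a multiple-hypothesis correction. It is a minimal illustration in base R under assumed inputs (a standardized effect size of 0.3 and five hypothetical p-values), not the authors' published code; the paper itself supplies fuller R and Stata routines, including the resampling-based Romano-Wolf correction (references 10 and 25 below).

    # Minimal sketch, base R only; all inputs are illustrative assumptions.

    # 1) A priori power analysis: participants per arm for a two-sample
    #    t-test to detect a standardized effect of d = 0.3 at alpha = 0.05
    #    with 80% power.
    power.t.test(delta = 0.3, sd = 1, sig.level = 0.05, power = 0.80,
                 type = "two.sample", alternative = "two.sided")
    # Returns n of about 175 per treatment arm; halving d roughly
    # quadruples the required sample size.

    # 2) Multiple-hypothesis correction: Holm's step-down adjustment of
    #    p-values from five hypothetical pre-registered outcomes. The
    #    Romano-Wolf procedure the paper discusses is also step-down but
    #    resampling-based, so it needs the underlying data (Stata: rwolf).
    p_raw <- c(0.01, 0.03, 0.04, 0.20, 0.65)  # illustrative p-values
    p.adjust(p_raw, method = "holm")
    # The smallest p is multiplied by 5, the next by 4, and so on, with
    # monotonicity enforced: 0.05 0.12 0.12 0.40 0.65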

Suggested Citation

  • Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
  • Handle: RePEc:hal:journl:halshs-03897719
    DOI: 10.1007/s40881-022-00123-1
    Note: View the original document on HAL open archive server: https://shs.hal.science/halshs-03897719

    Download full text from publisher

    File URL: https://shs.hal.science/halshs-03897719/document
    Download Restriction: no

    File URL: https://libkey.io/10.1007/s40881-022-00123-1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Charles Bellemare & Luc Bissonnette & Sabine Kröger, 2016. "Simulating power of economic experiments: the powerBBK package," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 2(2), pages 157-168, November.
    4. Daniele Fanelli, 2010. "“Positive” Results Increase Down the Hierarchy of the Sciences," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-10, April.
    5. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    8. Necker, Sarah, 2014. "Scientific misbehavior in economics," Research Policy, Elsevier, vol. 43(10), pages 1747-1759.
    9. Ziliak, Stephen T. & McCloskey, Deirdre N., 2004. "Size matters: the standard error of regressions in the American Economic Review," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 33(5), pages 527-546, November.
    10. Damian Clarke & Joseph P. Romano & Michael Wolf, 2020. "The Romano–Wolf multiple-hypothesis correction in Stata," Stata Journal, StataCorp LP, vol. 20(4), pages 812-843, December.
    11. Eliot Abrams & Jonathan Libgober & John List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," Artefactual Field Experiments 00703, The Field Experiments Website.
    12. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    13. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    14. Nicholas Swanson & Garret Christensen & Rebecca Littman & David Birke & Edward Miguel & Elizabeth Levy Paluck & Zenan Wang, 2020. "Research Transparency Is on the Rise in Economics," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 61-65, May.
    15. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    16. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    17. Valentin Amrhein & Sander Greenland & Blake McShane, 2019. "Scientists rise up against statistical significance," Nature, Nature, vol. 567(7748), pages 305-307, March.
    18. Deirdre N. McCloskey & Stephen T. Ziliak, 1996. "The Standard Error of Regressions," Journal of Economic Literature, American Economic Association, vol. 34(1), pages 97-114, March.
    19. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    20. Rachel Glennerster & Kudzai Takavarasha, 2013. "Running Randomized Evaluations: A Practical Guide," Economics Books, Princeton University Press, edition 1, number 10085.
    21. Paul J. Ferraro & Pallavi Shukla, 2020. "Feature—Is a Replicability Crisis on the Horizon for Environmental and Resource Economics?," Review of Environmental Economics and Policy, University of Chicago Press, vol. 14(2), pages 339-351.
    22. Christopher D. Chambers & Loukia Tzavella, 2022. "The past, present and future of Registered Reports," Nature Human Behaviour, Nature, vol. 6(1), pages 29-42, January.
    23. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    24. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.
    25. Romano, Joseph P. & Wolf, Michael, 2016. "Efficient computation of adjusted p-values for resampling-based stepdown multiple testing," Statistics & Probability Letters, Elsevier, vol. 113(C), pages 38-40.
    26. Romain Espinosa & Nicolas Treich, 2021. "Moderate Versus Radical NGOs," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(4), pages 1478-1501, August.
    27. Marjan Bakker & Coosje L S Veldkamp & Marcel A L M van Assen & Elise A V Crompvoets & How Hwee Ong & Brian A Nosek & Courtney K Soderberg & David Mellor & Jelte M Wicherts, 2020. "Ensuring the quality and specificity of preregistrations," PLOS Biology, Public Library of Science, vol. 18(12), pages 1-18, December.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Luca A. Panzone & Natasha Auch & Daniel John Zizzo, 2024. "Nudging the Food Basket Green: The Effects of Commitment and Badges on the Carbon Footprint of Food Shopping," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 87(1), pages 89-133, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    2. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Heath, Davidson & Ringgenberg, Matthew C. & Samadi, Mehrdad & Werner, Ingrid M., 2019. "Reusing Natural Experiments," Working Paper Series 2019-21, Ohio State University, Charles A. Dice Center for Research in Financial Economics.
    6. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    7. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    8. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    9. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    10. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    11. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    12. Hermes, Henning & Mierisch, Fabian & Peter, Frauke & Wiederhold, Simon & Lergetporer, Philipp, 2023. "Discrimination on the Child Care Market: A Nationwide Field Experiment," IZA Discussion Papers 16082, Institute of Labor Economics (IZA).
    13. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    14. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    15. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    16. Henning Hermes & Philipp Lergetporer & Fabian Mierisch & Frauke Peter & Simon Wiederhold, 2023. "Discrimination in Universal Social Programs? A Nationwide Field Experiment on Access to Child Care," CESifo Working Paper Series 10368, CESifo.
    17. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    18. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    19. Chowdhury, Shyamal & Hasan, Syed & Sharma, Uttam, 2024. "The Role of Trainee Selection in the Effectiveness of Vocational Training: Evidence from a Randomized Controlled Trial in Nepal," IZA Discussion Papers 16705, Institute of Labor Economics (IZA).
    20. Hermes, Henning & Krauß, Marina & Lergetporer, Philipp & Peter, Frauke & Wiederhold, Simon, 2022. "Early Child Care and Labor Supply of Lower-SES Mothers: A Randomized Controlled Trial," IZA Discussion Papers 15814, Institute of Labor Economics (IZA).

    More about this item

    Keywords

    Registered Reports; practical guide; pre-registration; p-hacking; HARKing; multiple-hypothesis testing; power analysis; smallest effect size of interest;


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hal:journl:halshs-03897719. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: CCSD (email available below). General contact details of provider: https://hal.archives-ouvertes.fr/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.