Printed from https://ideas.repec.org/p/osf/metaar/5nsh3.html

How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics

Author

Listed:
  • Coville, Aidan
  • Vivalt, Eva

Abstract

Under-powered studies combined with low prior beliefs about intervention effects increase the chances that a positive result is overstated. We collect prior beliefs about intervention impacts from 125 experts to estimate the false positive and false negative report probabilities (FPRP and FNRP) as well as Type S (sign) and Type M (magnitude) errors for studies in development economics. We find that the large majority of studies in our sample are generally credible. We discuss how more systematic collection and use of prior expectations could help improve the literature.
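The quantities named in the abstract can be illustrated with a short sketch (a hypothetical illustration, not the authors' code). The false positive report probability follows the standard formula FPRP = α(1−π) / (α(1−π) + (1−β)π), where π is the prior probability of a real effect and 1−β is power; Type S and Type M errors are estimated with a Gelman–Carlin style "retrodesign" simulation. All parameter values below are assumptions chosen for illustration.

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function (no external dependencies)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fprp(prior, power, alpha=0.05):
    """False positive report probability: among statistically significant
    results, the share for which the null hypothesis is in fact true,
    given the prior probability of a real effect and the study's power."""
    return alpha * (1 - prior) / (alpha * (1 - prior) + power * prior)

def retrodesign(true_effect, se, n_sims=100_000, seed=0):
    """Gelman-Carlin style retrodesign: given an assumed true effect and a
    study's standard error, return power, the Type S (wrong-sign) error rate
    among significant estimates, and the Type M exaggeration ratio."""
    z_crit = 1.959964  # two-sided 5% critical value
    z_true = true_effect / se
    power = 1 - norm_cdf(z_crit - z_true) + norm_cdf(-z_crit - z_true)
    rng = random.Random(seed)
    sig = [est for est in (rng.gauss(true_effect, se) for _ in range(n_sims))
           if abs(est / se) > z_crit]
    type_s = sum(1 for est in sig if est * true_effect < 0) / len(sig)
    exaggeration = (sum(abs(est) for est in sig) / len(sig)) / abs(true_effect)
    return power, type_s, exaggeration

# With a weak prior (10% chance of a real effect) and 50% power, nearly
# half of all significant findings would be false positives.
print(round(fprp(prior=0.1, power=0.5), 3))  # 0.474
```

The sketch shows the abstract's core point: when priors are low and power is weak, a significant estimate is likely overstated in magnitude and may even have the wrong sign, even though each individual test is run at the conventional 5% level.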

Suggested Citation

  • Coville, Aidan & Vivalt, Eva, 2017. "How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics," MetaArXiv 5nsh3, Center for Open Science.
  • Handle: RePEc:osf:metaar:5nsh3
    DOI: 10.31219/osf.io/5nsh3

    Download full text from publisher

    File URL: https://osf.io/download/59910090b83f690252e29b22/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/5nsh3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    2. David McKenzie & Christopher Woodruff, 2014. "What Are We Learning from Business Training and Entrepreneurship Evaluations around the Developing World?," The World Bank Research Observer, World Bank, vol. 29(1), pages 48-82.
    3. Sarojini Hirshleifer & David McKenzie & Rita Almeida & Cristobal Ridao‐Cano, 2016. "The Impact of Vocational Training for the Unemployed: Experimental Evidence from Turkey," Economic Journal, Royal Economic Society, vol. 126(597), pages 2115-2146, November.
    4. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    5. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    6. Matthew Groh & Nandini Krishnan & David McKenzie & Tara Vishwanath, 2016. "The impact of soft skills training on female youth employment: evidence from a randomized experiment in Jordan," IZA Journal of Labor & Development, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 5(1), pages 1-23, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    3. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    4. Beber, Bernd & Dworschak, Regina & Lakemann, Tabea & Lay, Jann & Priebe, Jan, 2021. "Skills Development and Training Interventions in Africa: Findings, Challenges, and Opportunities," RWI Projektberichte, RWI - Leibniz-Institut für Wirtschaftsforschung, number 247426.
    5. Fox, Louise & Kaul, Upaasna, 2018. "The evidence is in: how should youth employment programs in low-income countries be designed?," Policy Research Working Paper Series 8500, The World Bank.
    6. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    7. Calderone, Margherita & Fiala, Nathan & Melyoki, Lemayon Lemilia & Schoofs, Annekathrin & Steinacher, Rachel, 2022. "Making intense skills training work at scale: Evidence on business and labor market outcomes in Tanzania," Ruhr Economic Papers 950, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    8. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    9. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    10. Maitra, Pushkar & Mani, Subha, 2017. "Learning and earning: Evidence from a randomized evaluation in India," Labour Economics, Elsevier, vol. 45(C), pages 116-130.
    11. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    12. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    13. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    14. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    15. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    16. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    17. Eliot Abrams & Jonathan Libgober & John A. List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," NBER Working Papers 27250, National Bureau of Economic Research, Inc.
    18. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    19. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    20. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:5nsh3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.