Printed from https://ideas.repec.org/a/eee/ecolet/v168y2018icp56-60.html

Improving transparency in observational social science research: A pre-analysis plan approach

Author

Listed:
  • Burlig, Fiona

Abstract

Social science research has undergone a credibility revolution, but these gains are at risk due to problematic research practices. Existing research on transparency has centered around randomized controlled trials, which constitute only a small fraction of research in economics. In this paper, I highlight three scenarios in which study preregistration can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data.

Suggested Citation

  • Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
  • Handle: RePEc:eee:ecolet:v:168:y:2018:i:c:p:56-60
    DOI: 10.1016/j.econlet.2018.03.036
    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0165176518301277
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.econlet.2018.03.036?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to look for a different version of it below or search for one elsewhere.

    References listed on IDEAS

    1. Anderson, Michael L, 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Department of Agricultural & Resource Economics, UC Berkeley, Working Paper Series qt15n8j26f, Department of Agricultural & Resource Economics, UC Berkeley.
    2. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    3. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    4. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    5. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    6. Humphreys, Macartan & Sanchez de la Sierra, Raul & van der Windt, Peter, 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration," Political Analysis, Cambridge University Press, vol. 21(1), pages 1-20, January.
    7. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    8. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    9. David Card & Stefano DellaVigna & Ulrike Malmendier, 2011. "The Role of Theory in Field Experiments," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 39-62, Summer.
    10. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    11. Amy Finkelstein & Sarah Taubman & Bill Wright & Mira Bernstein & Jonathan Gruber & Joseph P. Newhouse & Heidi Allen & Katherine Baicker, 2012. "The Oregon Health Insurance Experiment: Evidence from the First Year," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(3), pages 1057-1106.
    12. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    13. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    14. repec:bla:econom:v:47:y:1980:i:188:p:387-406 is not listed on IDEAS
    15. David Neumark, 1999. "The Employment Effects of Recent Minimum Wage Increases: Evidence from a Pre-specified Research Design," NBER Working Papers 7171, National Bureau of Economic Research, Inc.
    16. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    17. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Bensch, Gunther & Ankel-Peters, Jörg & Vance, Colin, 2023. "Spotlight on Researcher Decisions – Infrastructure Evaluation, Instrumental Variables, and Specification Screening," VfS Annual Conference 2023 (Regensburg): Growth and the "sociale Frage" 277703, Verein für Socialpolitik / German Economic Association.
    2. Carina Neisser, 2021. "The Elasticity of Taxable Income: A Meta-Regression Analysis [The top 1% in international and historical perspective]," The Economic Journal, Royal Economic Society, vol. 131(640), pages 3365-3391.
    3. Ankel-Peters, Jörg & Bruederle, Anna & Roberts, Gareth, 2022. "Weather and Crime—Cautious evidence from South Africa," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 3(1), pages 1-22.
    4. Martin Brown & Nicole Hentschel & Hannes Mettler & Helmut Stix, 2020. "Financial Innovation, Payment Choice and Cash Demand – Causal Evidence from the Staggered Introduction of Contactless Debit Cards," Working Papers 230, Oesterreichische Nationalbank (Austrian Central Bank).
    5. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," GLO Discussion Paper Series 1147, Global Labor Organization (GLO).
    6. Clemens, Jeffrey & Strain, Michael R., 2021. "The Heterogeneous Effects of Large and Small Minimum Wage Changes: Evidence over the Short and Medium Run Using a Pre-analysis Plan," IZA Discussion Papers 14747, Institute of Labor Economics (IZA).
    7. Joel Ferguson & Rebecca Littman & Garret Christensen & Elizabeth Levy Paluck & Nicholas Swanson & Zenan Wang & Edward Miguel & David Birke & John-Henry Pezzuto, 2023. "Survey of open science practices and attitudes in the social sciences," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    8. Brown, Martin & Hentschel, Nicole & Mettler, Hannes & Stix, Helmut, 2022. "The convenience of electronic payments and consumer cash demand," Journal of Monetary Economics, Elsevier, vol. 130(C), pages 86-102.
    9. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    10. Joel Bank & Hamish Fitchett & Adam Gorajek & Benjamin Malin & Andrew Staib, 2021. "Star Wars at Central Banks," Reserve Bank of New Zealand Discussion Paper Series DP2021/01, Reserve Bank of New Zealand.
    11. Ofosu, George K. & Posner, Daniel N., 2020. "Do pre-analysis plans hamper publication?," LSE Research Online Documents on Economics 112748, London School of Economics and Political Science, LSE Library.
    12. Emilio Depetris-Chauvin & Felipe González, 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Documentos de Trabajo 572, Instituto de Economia, Pontificia Universidad Católica de Chile.
    13. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2024. "Do Preregistration and Preanalysis Plans Reduce p-Hacking and Publication Bias? Evidence from 15,992 Test Statistics and Suggestions for Improvement," Journal of Political Economy Microeconomics, University of Chicago Press, vol. 2(3), pages 527-561.
    14. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    15. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    16. Daniel Gilfillan & Stacy-ann Robinson & Hannah Barrowman, 2020. "Action Research to Enhance Inter-Organisational Coordination of Climate Change Adaptation in the Pacific," Challenges, MDPI, vol. 11(1), pages 1-24, May.
    17. Travis J. Lybbert & Steven T. Buccola, 2021. "The evolving ethics of analysis, publication, and transparency in applied economics," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1330-1351, December.
    18. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    19. Martin Brown & Nicole Hentschel & Hannes Mettler & Helmut Stix, 2020. "Financial Innovation, Payment Choice and Cash Demand - Causal Evidence from the Staggered Introduction of Contactless Debit Cards," Working Papers on Finance 2002, University of St. Gallen, School of Finance.
    20. Ankel-Peters, Jörg & Vance, Colin & Bensch, Gunther, 2022. "Spotlight on researcher decisions – Infrastructure evaluation, instrumental variables, and first-stage specification screening," OSF Preprints sw6kd, Center for Open Science.
    21. , 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Working Papers 953, Queen Mary University of London, School of Economics and Finance.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    4. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    5. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    6. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    7. Opoku-Agyemang, Kweku A., 2017. "A Human-Computer Interaction Approach for Integrity in Economics," SocArXiv ra3cs, Center for Open Science.
    8. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    9. Lars Ivar Oppedal Berge & Kjetil Bjorvatn & Simon Galle & Edward Miguel & Daniel N. Posner & Bertil Tungodden & Kelly Zhang, 2015. "How Strong are Ethnic Preferences?," NBER Working Papers 21715, National Bureau of Economic Research, Inc.
    10. Damgaard, Mette Trier & Gravert, Christina, 2018. "The hidden costs of nudging: Experimental evidence from reminders in fundraising," Journal of Public Economics, Elsevier, vol. 157(C), pages 15-26.
    11. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    12. Bogdanoski, Aleksandar & Ofosu, George & Posner, Daniel N, 2019. "Pre-analysis Plans: A Stocktaking," MetaArXiv e4pum, Center for Open Science.
    13. Leah H. Palm-Forster & Paul J. Ferraro & Nicholas Janusch & Christian A. Vossler & Kent D. Messer, 2019. "Behavioral and Experimental Agri-Environmental Research: Methodological Challenges, Literature Gaps, and Recommendations," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 719-742, July.
    14. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    15. Steinert, Janina I. & Zenker, Juliane & Filipiak, Ute & Movsisyan, Ani & Cluver, Lucie D. & Shenderovich, Yulia, 2018. "Do saving promotion interventions increase household savings, consumption, and investments in Sub-Saharan Africa? A systematic review and meta-analysis," World Development, Elsevier, vol. 104(C), pages 238-256.
    16. King, Elisabeth & Samii, Cyrus, 2014. "Fast-Track Institution Building in Conflict-Affected Countries? Insights from Recent Field Experiments," World Development, Elsevier, vol. 64(C), pages 740-754.
    17. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    18. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    19. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "A model of multiple hypothesis testing," Papers 2104.13367, arXiv.org, revised Apr 2024.
    20. Heath, Davidson & Ringgenberg, Matthew C. & Samadi, Mehrdad & Werner, Ingrid M., 2019. "Reusing Natural Experiments," Working Paper Series 2019-21, Ohio State University, Charles A. Dice Center for Research in Financial Economics.

    More about this item

    Keywords

    Transparency; Pre-registration; Observational research; Confidential data;

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.