
Improving transparency in observational social science research: A pre-analysis plan approach

Author

Listed:
  • Burlig, Fiona

Abstract

Social science research has undergone a credibility revolution, but these gains are at risk due to problematic research practices. Existing research on transparency has centered on randomized controlled trials, which constitute only a small fraction of research in economics. In this paper, I highlight three scenarios in which study preregistration can be credibly applied in non-experimental settings: cases where researchers collect their own data; prospective studies; and research using restricted-access data.

Suggested Citation

  • Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
  • Handle: RePEc:eee:ecolet:v:168:y:2018:i:c:p:56-60
    DOI: 10.1016/j.econlet.2018.03.036

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0165176518301277
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.econlet.2018.03.036?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to look for a different version of it below or search for one.

    References listed on IDEAS

    1. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. Humphreys, Macartan & Sanchez de la Sierra, Raul & van der Windt, Peter, 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration," Political Analysis, Cambridge University Press, vol. 21(1), pages 1-20, January.
    4. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, Oxford University Press, vol. 127(4), pages 1755-1812.
    5. Leamer, Edward E, 1983. "Let's Take the Con Out of Econometrics," American Economic Review, American Economic Association, vol. 73(1), pages 31-43, March.
    6. Amy Finkelstein & Sarah Taubman & Bill Wright & Mira Bernstein & Jonathan Gruber & Joseph P. Newhouse & Heidi Allen & Katherine Baicker, 2012. "The Oregon Health Insurance Experiment: Evidence from the First Year," The Quarterly Journal of Economics, Oxford University Press, vol. 127(3), pages 1057-1106.
    7. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    8. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    9. Anderson, Michael L., 2008. "Multiple Inference and Gender Differences in the Effects of Early Intervention: A Reevaluation of the Abecedarian, Perry Preschool, and Early Training Projects," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1481-1495.
    10. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    11. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    12. David Card & Stefano DellaVigna & Ulrike Malmendier, 2011. "The Role of Theory in Field Experiments," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 39-62, Summer.
    13. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    14. Hendry, David F, 1980. "Econometrics-Alchemy or Science?," Economica, London School of Economics and Political Science, vol. 47(188), pages 387-406, November.
    15. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    16. David Neumark, 1999. "The Employment Effects of Recent Minimum Wage Increases: Evidence from a Pre-specified Research Design," NBER Working Papers 7171, National Bureau of Economic Research, Inc.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Carina Neisser, 2017. "The elasticity of taxable income: A meta-regression analysis," Working Papers 2017/10, Institut d'Economia de Barcelona (IEB).
    2. Brüderle, Mirjam Anna & Peters, Jörg & Roberts, Gareth, 2022. "Weather and crime: Cautious evidence from South Africa," Ruhr Economic Papers 940, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    3. Martin Brown & Nicole Hentschel & Hannes Mettler & Helmut Stix, 2020. "Financial Innovation, Payment Choice and Cash Demand - Causal Evidence from the Staggered Introduction of Contactless Debit Cards," Working Papers on Finance 2002, University of St. Gallen, School of Finance.
    4. Clemens, Jeffrey & Strain, Michael R., 2021. "The Heterogeneous Effects of Large and Small Minimum Wage Changes: Evidence over the Short and Medium Run Using a Pre-analysis Plan," IZA Discussion Papers 14747, Institute of Labor Economics (IZA).
    5. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    6. Joel Bank & Hamish Fitchett & Adam Gorajek & Benjamin A. Malin & Andrew Staib, 2021. "Star Wars at Central Banks," Staff Report 620, Federal Reserve Bank of Minneapolis.
    7. Ofosu, George K. & Posner, Daniel N., 2020. "Do pre-analysis plans hamper publication?," LSE Research Online Documents on Economics 112748, London School of Economics and Political Science, LSE Library.
    8. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    9. Daniel Gilfillan & Stacy-ann Robinson & Hannah Barrowman, 2020. "Action Research to Enhance Inter-Organisational Coordination of Climate Change Adaptation in the Pacific," Challenges, MDPI, vol. 11(1), pages 1-24, May.
    10. Travis J. Lybbert & Steven T. Buccola, 2021. "The evolving ethics of analysis, publication, and transparency in applied economics," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1330-1351, December.
    11. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    12. Adam Gorajek & Joel Bank & Andrew Staib & Benjamin Malin & Hamish Fitchett, 2021. "Star Wars at Central Banks," RBA Research Discussion Papers rdp2021-02, Reserve Bank of Australia.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Christensen, Garret & Miguel, Edward & Sturdy, Jennifer, 2017. "Transparency, Reproducibility, and the Credibility of Economics Research," MetaArXiv 9a3rw, Center for Open Science.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    5. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    6. Bogdanoski, Aleksandar & Ofosu, George & Posner, Daniel N, 2019. "Pre-analysis Plans: A Stocktaking," MetaArXiv e4pum, Center for Open Science.
    7. Opoku-Agyemang, Kweku A., 2017. "A Human-Computer Interaction Approach for Integrity in Economics," SocArXiv ra3cs, Center for Open Science.
    8. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    9. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    10. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    11. Stephan B. Bruns, 2013. "Identifying Genuine Effects in Observational Research by Means of Meta-Regressions," Jena Economic Research Papers 2013-040, Friedrich-Schiller-University Jena.
    12. Cristina Blanco-Perez & Abel Brodeur, 2019. "Transparency in empirical economic research," IZA World of Labor, Institute of Labor Economics (IZA), pages 467-467, November.
    13. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    14. Johnson, Samuel G. B., 2019. "Toward a cognitive science of markets: Economic agents as sense-makers," Economics Discussion Papers 2019-10, Kiel Institute for the World Economy (IfW Kiel).
    15. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," World Bank Economic Review, World Bank Group, vol. 31(3), pages 687-707.
    16. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    17. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    18. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    19. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    20. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.

    More about this item

    Keywords

    Transparency; Pre-registration; Observational research; Confidential data;

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ecolet:v:168:y:2018:i:c:p:56-60. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/ecolet.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.