Printed from https://ideas.repec.org/a/wly/iecrev/v61y2020i4p1387-1409.html

2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability

Authors
  • Omar Al‐Ubaydli
  • John A. List
  • Dana Suskind

Abstract

Policymakers are increasingly facing the challenge of scaling empirical insights. This study provides a theoretical lens into the science of how to use science. Through a simple model, we highlight three elements of the scale‐up problem: (1) when does evidence become actionable; (2) properties of the population; and (3) properties of the situation. Until these three areas are fully understood, the threats to scalability will render any scaling exercise particularly vulnerable. Accordingly, our work represents a call for more policy‐based evidence, whereby the nature and extent of the various threats to scalability are explored in the original research program.

Suggested Citation

  • Omar Al‐Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
  • Handle: RePEc:wly:iecrev:v:61:y:2020:i:4:p:1387-1409
    DOI: 10.1111/iere.12476

    Download full text from publisher

    File URL: https://doi.org/10.1111/iere.12476
    Download Restriction: no

    File URL: https://libkey.io/10.1111/iere.12476?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    4. repec:feb:framed:0071 is not listed on IDEAS
    5. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    6. repec:feb:artefa:0087 is not listed on IDEAS
    7. Roland Fryer & Steven Levitt & John List & Sally Sadoff, 2012. "Enhancing the Efficacy of Teacher Incentives through Loss Aversion: A Field Experiment," Framed Field Experiments 00591, The Field Experiments Website.
    8. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    9. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    10. Glenn W. Harrison & John A. List, 2008. "Naturally Occurring Markets and Exogenous Laboratory Experiments: A Case Study of the Winner's Curse," Economic Journal, Royal Economic Society, vol. 118(528), pages 822-843, April.
    11. Glenn W. Harrison & Morten I. Lau & E. Elisabet Rutström, 2007. "Estimating Risk Attitudes in Denmark: A Field Experiment," Scandinavian Journal of Economics, Wiley Blackwell, vol. 109(2), pages 341-368, June.
    12. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    13. repec:mpr:mprres:6758 is not listed on IDEAS
    14. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    15. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    16. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    17. repec:mpr:mprres:6762 is not listed on IDEAS
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. John List, 2021. "The Voltage Effect in Behavioral Economics," Artefactual Field Experiments 00733, The Field Experiments Website.
    2. John A. List & Ragan Petrie & Anya Samek, 2023. "How Experiments with Children Inform Economics," Journal of Economic Literature, American Economic Association, vol. 61(2), pages 504-564, June.
    3. Omar Al-Ubaydli & Chien-Yu Lai & John A. List, 2023. "A Simple Rational Expectations Model of the Voltage Effect," NBER Working Papers 30850, National Bureau of Economic Research, Inc.
    4. John A. List & Fatemeh Momeni & Yves Zenou, 2020. "The Social Side of Early Human Capital Formation: Using a Field Experiment to Estimate the Causal Impact of Neighborhoods," Working Papers 2020-187, Becker Friedman Institute for Research In Economics.
    5. Felipe Barrera-Osorio & Paul Gertler & Nozomi Nakajima & Harry Patrinos, 2020. "Promoting Parental Involvement in Schools: Evidence From Two Randomized Experiments," NBER Working Papers 28040, National Bureau of Economic Research, Inc.
    6. John List & Julie Pernaudet & Dana Suskind, 2021. "It All Starts with Beliefs: Addressing the Roots of Educational Inequities by Changing Parental Beliefs," Framed Field Experiments 00740, The Field Experiments Website.
    7. Al-Ubaydli, Omar & Cassidy, Alecia & Chatterjee, Anomitro & Khalifa, Ahmed & Price, Michael, 2023. "The power to conserve: a field experiment on electricity use in Qatar," LSE Research Online Documents on Economics 121048, London School of Economics and Political Science, LSE Library.
    8. Guo, Juncong & Qu, Xi, 2022. "Competition in household human capital investments: Strength, motivations and consequences," Journal of Development Economics, Elsevier, vol. 158(C).
    9. Heller, Sara B., 2022. "When scale and replication work: Learning from summer youth employment experiments," Journal of Public Economics, Elsevier, vol. 209(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    3. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    4. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    5. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    6. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    7. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    8. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    9. John A. List & Michael K. Price, 2016. "Editor's Choice The Use of Field Experiments in Environmental and Resource Economics," Review of Environmental Economics and Policy, Association of Environmental and Resource Economists, vol. 10(2), pages 206-225.
    10. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    11. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    12. Greer K. Gosnell & John A. List & Robert Metcalfe, 2016. "A New Approach to an Age-Old Problem: Solving Externalities by Incenting Workers Directly," NBER Working Papers 22316, National Bureau of Economic Research, Inc.
    13. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    14. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2018. "What you don't know...Can't hurt you?: A natural field experiment on relative performance feedback in higher education," Sciences Po publications info:hdl:2441/5fhe3c1k6b8, Sciences Po.
    15. Christian A. Vossler, 2016. "Chamberlin Meets Ciriacy-Wantrup: Using Insights from Experimental Economics to Inform Stated Preference Research," Canadian Journal of Agricultural Economics/Revue canadienne d'agroeconomie, Canadian Agricultural Economics Society/Societe canadienne d'agroeconomie, vol. 64(1), pages 33-48, March.
    16. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2019. "What You Don’t Know…Can’t Hurt You? A Natural Field Experiment on Relative Performance Feedback in Higher Education," Management Science, INFORMS, vol. 65(8), pages 3714-3736, August.
    17. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    18. repec:hal:spmain:info:hdl:2441/5r0qo9lp3v97hptv0tki570p06 is not listed on IDEAS
    19. Timothy N. Cason & Steven Y. Wu, 2019. "Subject Pools and Deception in Agricultural and Resource Economics Experiments," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 743-758, July.
    20. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    21. John A. List, 2014. "Using Field Experiments to Change the Template of How We Teach Economics," The Journal of Economic Education, Taylor & Francis Journals, vol. 45(2), pages 81-89, June.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:iecrev:v:61:y:2020:i:4:p:1387-1409. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/deupaus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.