
A Simple Rational Expectations Model Of The Voltage Effect

Author

Listed:
  • Omar Al-Ubaydli
  • Jason Chien-Yu
  • John List

Abstract

The "voltage effect" is defined as the tendency for a program's efficacy to change when it is scaled up, which in most cases results in the absolute size of a program's treatment effects to diminish when the program is scaled. Understanding the scaling problem and taking steps to diminish voltage drops are important because if left unaddressed, the scaling problem can weaken the public's faith in science, and it can lead to a misallocation of public resources. There exists a growing literature illustrating the prevalence of the scaling problem, explaining its causes, and proposing countermeasures. This paper adds to the literature by providing a simple model of the scaling problem that is consistent with rational expectations by the key stakeholders. Our model highlights that asymmetric information is a key contributor to the voltage effect.

Suggested Citation

  • Omar Al-Ubaydli & Jason Chien-Yu & John List, 2023. "A Simple Rational Expectations Model Of The Voltage Effect," Artefactual Field Experiments 00766, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00766

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00766.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    2. Omar Al-Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
    3. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    4. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
    5. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    6. Michael Hallsworth & John A. List & Robert D. Metcalfe & Ivo Vlaev, 2017. "The behavioralist as tax collector: Using natural field experiments to enhance tax compliance," Journal of Public Economics, Elsevier, vol. 148(C), pages 14-31.
    7. Omar Al-Ubaydli & Min Sok Lee & John A. List & Claire L. Mackevicius & Dana Suskind, 2021. "How can experiments play a greater role in public policy? Twelve proposals from an economic model of scaling," Behavioural Public Policy, Cambridge University Press, vol. 5(1), pages 2-49, January.
    8. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    9. Omar Al-Ubaydli & Min Sok Lee & John A. List & Claire L. Mackevicius & Dana Suskind, 2021. "A rejoinder: ‘How can experiments play a greater role in public policy? Twelve proposals from an economic model of scaling’," Behavioural Public Policy, Cambridge University Press, vol. 5(1), pages 125-134, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    3. Eszter Czibor & David Jimenez-Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. John List, 2021. "The Voltage Effect in Behavioral Economics," Artefactual Field Experiments 00733, The Field Experiments Website.
    5. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    6. Andrew Dustan & Stanislao Maldonado & Juan Manuel Hernandez-Agramonte, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Working Papers 136, Peruvian Economic Association.
    7. John A. List & Ragan Petrie & Anya Samek, 2023. "How Experiments with Children Inform Economics," Journal of Economic Literature, American Economic Association, vol. 61(2), pages 504-564, June.
    8. Andrew Dustan & Juan Manuel Hernandez-Agramonte & Stanislao Maldonado, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Natural Field Experiments 00664, The Field Experiments Website.
    9. Dustan, Andrew & Hernandez-Agramonte, Juan Manuel & Maldonado, Stanislao, 2023. "Motivating bureaucrats with behavioral insights when state capacity is weak: Evidence from large-scale field experiments in Peru," Journal of Development Economics, Elsevier, vol. 160(C).
    10. Jan-Emmanuel De Neve & Clément Imbert & Johannes Spinnewijn & Teodora Tsankova & Maarten Luts, 2021. "How to Improve Tax Compliance? Evidence from Population-Wide Experiments in Belgium," Journal of Political Economy, University of Chicago Press, vol. 129(5), pages 1425-1463.
    11. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martínez, 2022. "How Effective Are Monetary Incentives to Vote? Evidence from a Nationwide Policy," American Economic Journal: Applied Economics, American Economic Association, vol. 14(1), pages 293-326, January.
    12. John A. List & Fatemeh Momeni & Yves Zenou, 2020. "The Social Side of Early Human Capital Formation: Using a Field Experiment to Estimate the Causal Impact of Neighborhoods," Working Papers 2020-187, Becker Friedman Institute for Research In Economics.
    13. Jeremy Bowles & Horacio Larreguy, 2019. "Who Debates, Who Wins? At-Scale Experimental Evidence on Debate Participation in a Liberian Election," CID Working Papers 375, Center for International Development at Harvard University.
    14. Bao, Helen X.H. & Robinson, Guy M., 2022. "Behavioural land use policy studies: Past, present, and future," Land Use Policy, Elsevier, vol. 115(C).
    15. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    16. Bowles, Jeremy & Larreguy, Horacio, 2020. "Who Debates, Who Wins? At-Scale Experimental Evidence on the Supply of Policy Information in a Liberian Election," TSE Working Papers 20-1153, Toulouse School of Economics (TSE).
    17. Robert Ammerman & Anne Duggan & John List & Lauren Supplee & Dana Suskind, 2021. "The role of open science practices in scaling evidence-based prevention programs," Natural Field Experiments 00741, The Field Experiments Website.
    18. Jonathan M.V. Davis & Jonathan Guryan & Kelly Hallberg & Jens Ludwig, 2017. "The Economics of Scale-Up," NBER Working Papers 23925, National Bureau of Economic Research, Inc.
    19. Monica P. Bhatt & Jonathan Guryan & Jens Ludwig & Anuj K. Shah, 2021. "Scope Challenges to Social Impact," NBER Working Papers 28406, National Bureau of Economic Research, Inc.
    20. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D61 - Microeconomics - - Welfare Economics - - - Allocative Efficiency; Cost-Benefit Analysis
    • D90 - Microeconomics - - Micro-Based Behavioral Economics - - - General

