Printed from https://ideas.repec.org/p/feb/artefa/00616.html

Scaling for economists: lessons from the non-adherence problem in the medical literature

Author

Listed:
  • Omar Al-Ubaydli
  • John List
  • Danielle LoRe
  • Dana Suskind

Abstract

Economists often conduct experiments demonstrating the benefits to individuals of modifying their behavior, such as adopting a new production process at work or investing in energy-saving technologies. A common occurrence is for the success of these interventions in small-scale studies to diminish substantially when they are applied at a larger scale, severely undermining the optimism advertised in the original research. One key contributor to this lack of success at scale is that the change demonstrated to be beneficial is not adopted to the extent that would be optimal. This problem is isomorphic to the problem of patient non-adherence to medications that are known to be effective. The large medical literature on countermeasures furnishes economists with potential remedies to this manifestation of the scaling problem.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • Omar Al-Ubaydli & John List & Danielle LoRe & Dana Suskind, 2017. "Scaling for economists: lessons from the non-adherence problem in the medical literature," Artefactual Field Experiments 00616, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00616
    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00616.pdf
    Download Restriction: no


    Citations

    Blog mentions

    As found by EconAcademics.org, the blog aggregator for Economics research:
    1. Sam Watson’s journal round-up for 13th November 2017
      by Sam Watson in The Academic Health Economists' Blog on 2017-11-13 18:33:33

Citations are extracted by the CitEc Project.


    Cited by:

    1. Jonathan M.V. Davis & Jonathan Guryan & Kelly Hallberg & Jens Ludwig, 2017. "The Economics of Scale-Up," NBER Working Papers 23925, National Bureau of Economic Research, Inc.

    More about this item

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • I11 - Health, Education, and Welfare - - Health - - - Analysis of Health Care Markets



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:artefa:00616. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Joe Seidel. General contact details of provider: http://www.fieldexperiments.com.


    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.