Printed from https://ideas.repec.org/p/nbr/nberwo/15701.html

So you want to run an experiment, now what? Some Simple Rules of Thumb for Optimal Experimental Design

Author

Listed:
  • John A. List
  • Sally Sadoff
  • Mathis Wagner

Abstract

Experimental economics represents a strong growth industry. In the past several decades the method has expanded beyond intellectual curiosity, now meriting consideration alongside the other more traditional empirical approaches used in economics. Accompanying this growth is an influx of new experimenters who are in need of straightforward direction to make their designs more powerful. This study provides several simple rules of thumb that researchers can apply to improve the efficiency of their experimental designs. We buttress these points by including empirical examples from the literature.
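The rules of thumb the paper discusses concern statistical power and sample allocation. As an illustrative sketch (not the authors' code; the function names are invented here), the standard per-arm sample-size formula for detecting a difference in means, together with the paper's best-known heuristic of allocating subjects across arms in proportion to outcome standard deviations, can be written with only the Python standard library:

```python
from math import ceil
from statistics import NormalDist

def per_arm_sample_size(delta, sigma, alpha=0.05, power=0.80):
    """Approximate per-arm n for a two-sample comparison of means with
    a common standard deviation sigma and minimum detectable effect delta:
    n = 2 * (z_{1-alpha/2} + z_{1-power})^2 * (sigma / delta)^2.
    Normal approximation; a t-based calculation adds a unit or two."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for two-sided test
    z_beta = z.inv_cdf(power)            # quantile corresponding to power
    return ceil(2 * (z_alpha + z_beta) ** 2 * (sigma / delta) ** 2)

def allocation_ratio(sigma_treat, sigma_control):
    """Rule-of-thumb allocation when arm variances differ: choose sample
    sizes so that n_treat / n_control = sigma_treat / sigma_control."""
    return sigma_treat / sigma_control

# Detecting a half-standard-deviation effect at 5% size, 80% power:
print(per_arm_sample_size(delta=0.5, sigma=1.0))   # about 63 per arm
# If the treatment outcome is twice as noisy, oversample treatment 2:1:
print(allocation_ratio(sigma_treat=2.0, sigma_control=1.0))
```

The equal-variance case reduces to the familiar equal split; the ratio rule only matters when pilot data suggest the arms have different outcome variances.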

Suggested Citation

  • John A. List & Sally Sadoff & Mathis Wagner, 2010. "So you want to run an experiment, now what? Some Simple Rules of Thumb for Optimal Experimental Design," NBER Working Papers 15701, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:15701
    Note: EEE LS PE

    Download full text from publisher

    File URL: http://www.nber.org/papers/w15701.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Jinyong Hahn & Keisuke Hirano & Dean Karlan, 2011. "Adaptive Experimental Design Using the Propensity Score," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 29(1), pages 96-108, January.
    2. Rutström, E. Elisabet & Wilcox, Nathaniel T., 2009. "Stated beliefs versus inferred beliefs: A methodological inquiry and experimental test," Games and Economic Behavior, Elsevier, vol. 67(2), pages 616-632, November.
    3. Camerer, Colin F & Hogarth, Robin M, 1999. "The Effects of Financial Incentives in Experiments: A Review and Capital-Labor-Production Framework," Journal of Risk and Uncertainty, Springer, vol. 19(1-3), pages 7-42, December.
    4. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    5. El-Gamal, Mahmoud A & Palfrey, Thomas R, 1996. "Economical Experiments: Bayesian Efficient Experimental Design," International Journal of Game Theory, Springer;Game Theory Society, vol. 25(4), pages 495-517.
    6. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    7. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    8. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    9. List John A., 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 6(2), pages 1-47, April.
    10. Lenth, R. V., 2001. "Some Practical Guidelines for Effective Sample Size Determination," The American Statistician, American Statistical Association, vol. 55, pages 187-193, August.
    11. Harrison, Glenn W. & Lau, Morten I. & Elisabet Rutström, E., 2009. "Risk attitudes, randomization to treatment, and self-selection into experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 70(3), pages 498-507, June.
    12. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    13. Graham Loomes, 2005. "Modelling the Stochastic Component of Behaviour in Experiments: Some Issues for the Interpretation of Data," Experimental Economics, Springer;Economic Science Association, vol. 8(4), pages 301-323, December.
    14. repec:feb:artefa:0090 is not listed on IDEAS
    15. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    16. John A. List, 2001. "Do Explicit Warnings Eliminate the Hypothetical Bias in Elicitation Procedures? Evidence from Field Auctions for Sportscards," American Economic Review, American Economic Association, vol. 91(5), pages 1498-1507, December.
    17. repec:feb:artefa:0087 is not listed on IDEAS
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    2. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    3. Hallsworth, Michael & List, John A. & Metcalfe, Robert D. & Vlaev, Ivo, 2017. "The behavioralist as tax collector: Using natural field experiments to enhance tax compliance," Journal of Public Economics, Elsevier, vol. 148(C), pages 14-31.
    4. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    5. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    6. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    7. List John A., 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 6(2), pages 1-47, April.
    8. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    9. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    10. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    11. John List, 2008. "Introduction to field experiments in economics with applications to the economics of charity," Experimental Economics, Springer;Economic Science Association, vol. 11(3), pages 203-212, September.
    12. John List, 2007. "Experimenting with Fish has some Advantages," Artefactual Field Experiments 00387, The Field Experiments Website.
    13. List, John A. & Reiley, David, 2008. "Field Experiments in Economics: Palgrave Entry," IZA Discussion Papers 3273, Institute of Labor Economics (IZA).
    14. Daniel Rondeau & John List, 2008. "Matching and challenge gifts to charity: evidence from laboratory and natural field experiments," Experimental Economics, Springer;Economic Science Association, vol. 11(3), pages 253-267, September.
    15. Ouazad, Amine & Page, Lionel, 2013. "Students' perceptions of teacher biases: Experimental economics in schools," Journal of Public Economics, Elsevier, vol. 105(C), pages 116-130.
    16. Nicolas Jacquemet & Olivier L’Haridon & Isabelle Vialle, 2014. "Marché du travail, évaluation et économie expérimentale," Revue française d'économie, Presses de Sciences-Po, vol. 0(1), pages 189-226.
    17. John A. List, 2014. "Using Field Experiments to Change the Template of How We Teach Economics," The Journal of Economic Education, Taylor & Francis Journals, vol. 45(2), pages 81-89, June.
    18. Buchholz, Matthias & Holst, Gesa & Musshoff, Oliver, 2015. "Water and irrigation policy impact assessment using business simulation games: evidence from northern Germany," Department of Agricultural and Rural Development (DARE) Discussion Papers 260781, Georg-August-Universitaet Goettingen, Department of Agricultural Economics and Rural Development (DARE).
    19. Hans-Martin Gaudecker & Arthur Soest & Erik Wengström, 2012. "Experts in experiments," Journal of Risk and Uncertainty, Springer, vol. 45(2), pages 159-190, October.
    20. Omar Al-Ubaydli & John List, 2012. "On the Generalizability of Experimental Results in Economics," Artefactual Field Experiments 00467, The Field Experiments Website.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

    NEP fields

    This paper has been announced in the following NEP Reports:


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:15701. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.