Printed from https://ideas.repec.org/p/nbr/nberwo/22566.html

Predicting Experimental Results: Who Knows What?

Author

Listed:
  • Stefano DellaVigna
  • Devin Pope

Abstract

Academic experts frequently recommend policies and treatments. But how well do they anticipate the impact of different treatments? And how do their predictions compare to the predictions of non-experts? We analyze how 208 experts forecast the results of 15 treatments involving monetary and non-monetary motivators in a real-effort task. We compare these forecasts to those made by PhD students and non-experts: undergraduates, MBAs, and an online sample. We document seven main results. First, the average forecast of experts predicts the experimental results quite well. Second, there is a strong wisdom-of-crowds effect: the average forecast outperforms 96 percent of individual forecasts. Third, correlates of expertise (citations, academic rank, field, and contextual experience) do not improve forecasting accuracy. Fourth, experts as a group do better than non-experts, but not if accuracy is defined as rank-ordering treatments. Fifth, measures of effort, confidence, and revealed ability are predictive of forecast accuracy to some extent, especially for non-experts. Sixth, using these measures we identify "superforecasters" among the non-experts who outperform the experts out of sample. Seventh, we document that these results on forecasting accuracy surprise the forecasters themselves. We present a simple model that organizes several of these results, and we stress the implications for the collection of forecasts of future experimental results.
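The wisdom-of-crowds result in the abstract can be illustrated with a small simulation (this is a hypothetical sketch, not the paper's data or method): if each forecaster's prediction equals the true treatment effect plus independent noise, averaging across forecasters cancels much of the noise, so the crowd mean typically beats the large majority of individual forecasts. All parameter values below (the true effect, the noise spread) are assumptions chosen for illustration; only the sample size of 208 comes from the paper.

```python
import random

random.seed(0)

TRUE_EFFECT = 100.0   # hypothetical true treatment effect (illustrative)
N_FORECASTERS = 208   # matches the paper's expert sample size
NOISE_SD = 15.0       # assumed spread of individual forecast errors

# Each forecaster's prediction = truth + idiosyncratic noise.
forecasts = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_FORECASTERS)]

# The "crowd" forecast is the simple average of all individual forecasts.
crowd_mean = sum(forecasts) / len(forecasts)

# Compare the crowd's absolute error to each individual's absolute error.
crowd_error = abs(crowd_mean - TRUE_EFFECT)
beaten = sum(1 for f in forecasts if abs(f - TRUE_EFFECT) > crowd_error)

print(f"crowd error: {crowd_error:.2f}")
print(f"crowd average beats {beaten / N_FORECASTERS:.0%} of individuals")
```

Under independent noise the crowd error shrinks roughly with the square root of the number of forecasters, so the average ends up beating most individuals, in the spirit of the paper's 96 percent figure (the exact fraction here depends on the assumed noise, not on the paper's data).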

Suggested Citation

  • Stefano DellaVigna & Devin Pope, 2016. "Predicting Experimental Results: Who Knows What?," NBER Working Papers 22566, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:22566
    Note: DEV ED HC LS PE PR

    Download full text from publisher

    File URL: http://www.nber.org/papers/w22566.pdf
    Download Restriction: no


    References listed on IDEAS

1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer; Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Erik Snowberg & Justin Wolfers & Eric Zitzewitz, 2007. "Partisan Impacts on the Economy: Evidence from Prediction Markets and Close Elections," The Quarterly Journal of Economics, Oxford University Press, vol. 122(2), pages 807-829.
    3. Alberto Cavallo & Guillermo Cruces & Ricardo Perez-Truglia, 2017. "Inflation Expectations, Learning, and Supermarket Prices: Evidence from Survey Experiments," American Economic Journal: Macroeconomics, American Economic Association, vol. 9(3), pages 1-35, July.
    4. Kahneman, Daniel & Schkade, David & Sunstein, Cass R, 1998. "Shared Outrage and Erratic Awards: The Psychology of Punitive Damages," Journal of Risk and Uncertainty, Springer, vol. 16(1), pages 49-86, April.
5. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    6. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," Review of Economic Studies, Oxford University Press, vol. 85(2), pages 1029-1069.
    7. Jonah Berger & Devin Pope, 2011. "Can Losing Lead to Winning?," Management Science, INFORMS, vol. 57(5), pages 817-827, May.

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

1. Alice Hsiaw & Ing-Haw Cheng, 2016. "Distrust in Experts and the Origins of Disagreement," Working Papers 110, Brandeis University, Department of Economics and International Business School.
    2. Eszter Czibor & David Jimenez-Gomez & John List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Artefactual Field Experiments 00648, The Field Experiments Website.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D03 - Microeconomics - - General - - - Behavioral Microeconomics: Underlying Principles



    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.