Printed from https://ideas.repec.org/p/nbr/nberwo/22566.html

Predicting Experimental Results: Who Knows What?

Authors
  • Stefano DellaVigna
  • Devin Pope

Abstract

Academic experts frequently recommend policies and treatments. But how well do they anticipate the impact of different treatments? And how do their predictions compare to the predictions of non-experts? We analyze how 208 experts forecast the results of 15 treatments involving monetary and non-monetary motivators in a real-effort task. We compare these forecasts to those made by PhD students and non-experts: undergraduates, MBAs, and an online sample. We document seven main results. First, the average forecast of experts predicts the experimental results quite well. Second, there is a strong wisdom-of-crowds effect: the average forecast outperforms 96 percent of individual forecasts. Third, correlates of expertise (citations, academic rank, field, and contextual experience) do not improve forecasting accuracy. Fourth, experts as a group do better than non-experts, but not if accuracy is defined as rank ordering treatments. Fifth, measures of effort, confidence, and revealed ability are predictive of forecast accuracy to some extent, especially for non-experts. Sixth, using these measures we identify "superforecasters" among the non-experts who outperform the experts out of sample. Seventh, we document that these results on forecasting accuracy surprise the forecasters themselves. We present a simple model that organizes several of these results, and we stress the implications for the collection of forecasts of future experimental results.
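The wisdom-of-crowds result (the average forecast outperforming 96 percent of individual forecasts) follows from noise cancellation: when individual forecast errors are roughly independent, averaging shrinks the error of the mean forecast well below the typical individual error. A minimal Python sketch of this mechanism, using simulated data (the truth value and noise scale here are hypothetical, not the paper's):

```python
import random

random.seed(0)

# Hypothetical setup: each forecaster predicts the true treatment effect
# plus independent noise. Averaging cancels the noise, so the mean
# forecast beats most individual forecasts.
truth = 2000.0          # hypothetical true outcome in one treatment
n_forecasters = 208     # matches the paper's number of experts
forecasts = [truth + random.gauss(0, 300) for _ in range(n_forecasters)]

mean_forecast = sum(forecasts) / n_forecasters
mean_error = abs(mean_forecast - truth)
individual_errors = [abs(f - truth) for f in forecasts]

# Share of individuals whose error exceeds the mean forecast's error
beaten = sum(e > mean_error for e in individual_errors)
print(f"mean forecast beats {100 * beaten / n_forecasters:.0f}% of individuals")
```

With independent errors, the mean forecast's standard error falls with the square root of the number of forecasters, which is why the average typically beats the large majority of individuals, as in the paper's second result.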

Suggested Citation

  • Stefano DellaVigna & Devin Pope, 2016. "Predicting Experimental Results: Who Knows What?," NBER Working Papers 22566, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:22566
    Note: DEV ED EH LS PE PR

    Download full text from publisher

    File URL: http://www.nber.org/papers/w22566.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Gabriele Paolacci & Jesse Chandler & Panagiotis G. Ipeirotis, 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 5(5), pages 411-419, August.
    2. Alberto Cavallo & Guillermo Cruces & Ricardo Perez-Truglia, 2017. "Inflation Expectations, Learning, and Supermarket Prices: Evidence from Survey Experiments," American Economic Journal: Macroeconomics, American Economic Association, vol. 9(3), pages 1-35, July.
    3. Kahneman, Daniel & Schkade, David & Sunstein, Cass R, 1998. "Shared Outrage and Erratic Awards: The Psychology of Punitive Damages," Journal of Risk and Uncertainty, Springer, vol. 16(1), pages 49-86, April.
    4. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
5. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    6. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    7. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    8. Jonah Berger & Devin Pope, 2011. "Can Losing Lead to Winning?," Management Science, INFORMS, vol. 57(5), pages 817-827, May.
    9. Roth, Alvin E. & Herzog, Stefan & Hau, Robin & Hertwig, Ralph & Erev, Ido & Ert, Eyal & Haruvy, Ernan & Stewart, Terrence & West, Robert & Lebiere, Christian, 2009. "A Choice Prediction Competition: Choices From Experience and From Description," Scholarly Articles 5343169, Harvard University Department of Economics.
    10. Erik Snowberg & Justin Wolfers & Eric Zitzewitz, 2007. "Partisan Impacts on the Economy: Evidence from Prediction Markets and Close Elections," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(2), pages 807-829.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Stefano DellaVigna & Devin Pope, 2022. "Stability of Experimental Results: Forecasts and Evidence," American Economic Journal: Microeconomics, American Economic Association, vol. 14(3), pages 889-925, August.
    2. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    3. Lefgren, Lars J. & Sims, David P. & Stoddard, Olga B., 2016. "Effort, luck, and voting for redistribution," Journal of Public Economics, Elsevier, vol. 143(C), pages 89-97.
    4. Sebastian Fest & Ola Kvaløy & Petra Nieken & Anja Schöttner, 2019. "Motivation and incentives in an online labor market," CESifo Working Paper Series 7526, CESifo.
    5. Zhang, Yinjunjie & Hoffmann, Manuel & Sara, Raisa & Eckel, Catherine, 2024. "Fairness preferences revisited," Journal of Economic Behavior & Organization, Elsevier, vol. 223(C), pages 278-306.
    6. Martin Abel, 2024. "Do Workers Discriminate against Female Bosses?," Journal of Human Resources, University of Wisconsin Press, vol. 59(2), pages 470-501.
    7. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    8. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    9. Wladislaw Mill & Jonathan Staebler, 2023. "Spite in Litigation," CRC TR 224 Discussion Paper Series crctr224_2023_401, University of Bonn and University of Mannheim, Germany.
    10. Doerrenberg, Philipp & Duncan, Denvil & Löffler, Max, 2023. "Asymmetric labor-supply responses to wage changes: Experimental evidence from an online labor market," Labour Economics, Elsevier, vol. 81(C).
    11. Marcus Giamattei & Kyanoush Seyed Yahosseini & Simon Gächter & Lucas Molleman, 2020. "LIONESS Lab: a free web-based platform for conducting interactive experiments online," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(1), pages 95-111, June.
    12. Doerrenberg, Philipp & Duncan, Denvil & Li, Danyang, 2024. "The (in)visible hand: Do workers discriminate against employers?," Journal of Public Economics, Elsevier, vol. 231(C).
    13. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    14. Mariconda, Simone & Lurati, Francesco, 2015. "Does familiarity breed stability? The role of familiarity in moderating the effects of new information on reputation judgments," Journal of Business Research, Elsevier, vol. 68(5), pages 957-964.
    15. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    16. Gökçe Esenduran & James A. Hill & In Joon Noh, 2020. "Understanding the Choice of Online Resale Channel for Used Electronics," Production and Operations Management, Production and Operations Management Society, vol. 29(5), pages 1188-1211, May.
    17. Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
    18. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    19. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    20. Cook, Nikolai & Heyes, Anthony, 2022. "Pollution pictures: Psychological exposure to pollution impacts worker productivity in a large-scale field experiment," Journal of Environmental Economics and Management, Elsevier, vol. 114(C).

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D03 - Microeconomics - - General - - - Behavioral Microeconomics: Underlying Principles


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:22566. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.