Printed from https://ideas.repec.org/a/plo/pone00/0248780.html

Predicting replicability—Analysis of survey and prediction market data from large-scale forecasting projects

Author

Listed:
  • Michael Gordon
  • Domenico Viganola
  • Anna Dreber
  • Magnus Johannesson
  • Thomas Pfeiffer

Abstract

The reproducibility of published research has become an important topic in science policy. A number of large-scale replication projects have been conducted to gauge the overall reproducibility in specific academic fields. Here, we present an analysis of data from four studies which sought to forecast the outcomes of replication projects in the social and behavioural sciences, using human experts who participated in prediction markets and answered surveys. Because the number of findings replicated and predicted in each individual study was small, pooling the data offers an opportunity to evaluate hypotheses regarding the performance of prediction markets and surveys at higher power. In total, peer beliefs were elicited for the replication outcomes of 103 published findings. We find that there is information within the scientific community about the replicability of scientific findings, and that both surveys and prediction markets can be used to elicit and aggregate this information. Our results show that prediction markets can determine the outcomes of direct replications with 73% accuracy (n = 103). Both the prediction market prices and the average survey responses are correlated with outcomes (0.581 and 0.564, respectively, both p
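The two metrics reported in the abstract can be illustrated with a short sketch: treating a final market price above 0.5 as a "will replicate" call gives the accuracy figure, and the Pearson correlation between forecasts and binary replication outcomes gives the reported correlations. The data below are invented for illustration only; they are not the study's data, and the 0.5 threshold is an assumption about how "accuracy" is operationalised.

```python
# Illustrative evaluation of replication forecasts (toy data, not the study's).
from math import sqrt

def accuracy(prices, outcomes):
    """Share of findings where the market call (price > 0.5) matches the outcome."""
    hits = sum((p > 0.5) == bool(o) for p, o in zip(prices, outcomes))
    return hits / len(prices)

def pearson(xs, ys):
    """Pearson correlation between forecasts and 0/1 replication outcomes."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

prices = [0.8, 0.3, 0.6, 0.2, 0.9]   # hypothetical final market prices
outcomes = [1, 0, 1, 0, 1]           # 1 = replicated, 0 = did not replicate

print(accuracy(prices, outcomes))              # 1.0 on this toy data
print(round(pearson(prices, outcomes), 3))     # 0.928 on this toy data
```

On the study's pooled data (n = 103) the corresponding values are the 73% accuracy and the 0.581 correlation reported above; survey responses can be evaluated the same way.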

Suggested Citation

  • Michael Gordon & Domenico Viganola & Anna Dreber & Magnus Johannesson & Thomas Pfeiffer, 2021. "Predicting replicability—Analysis of survey and prediction market data from large-scale forecasting projects," PLOS ONE, Public Library of Science, vol. 16(4), pages 1-14, April.
  • Handle: RePEc:plo:pone00:0248780
    DOI: 10.1371/journal.pone.0248780

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0248780
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0248780&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0248780?utm_source=ideas


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    3. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Bull, Charles & Courty, Pascal & Doyon, Maurice & Rondeau, Daniel, 2019. "Failure of the Becker–DeGroot–Marschak mechanism in inexperienced subjects: New tests of the game form misconception hypothesis," Journal of Economic Behavior & Organization, Elsevier, vol. 159(C), pages 235-253.
    6. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    7. Holger Herz & Dmitry Taubinsky, 2018. "What Makes a Price Fair? An Experimental Study of Transaction Experience and Endogenous Fairness Views," Journal of the European Economic Association, European Economic Association, vol. 16(2), pages 316-352.
    8. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    9. Brandts, Jordi & Riedl, Arno, 2020. "Market interaction and efficient cooperation," European Economic Review, Elsevier, vol. 121(C).
    10. Sangsuk Yoon & Nathan M. Fong & Angelika Dimoka, 2019. "The robustness of anchoring effects on preferential judgments," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 14(4), pages 470-487, July.
    11. Soo Hong Chew & Junjian Yi & Junsen Zhang & Songfa Zhong, 2018. "Risk Aversion and Son Preference: Experimental Evidence from Chinese Twin Parents," Management Science, INFORMS, vol. 64(8), pages 3896-3910, August.
    12. Williams, Cole Randall, 2019. "How redefining statistical significance can worsen the replication crisis," Economics Letters, Elsevier, vol. 181(C), pages 65-69.
    13. Despoina Alempaki & Emina Canic & Timothy L. Mullett & William J. Skylark & Chris Starmer & Neil Stewart & Fabio Tufano, 2019. "Reexamining How Utility and Weighting Functions Get Their Shapes: A Quasi-Adversarial Collaboration Providing a New Interpretation," Management Science, INFORMS, vol. 65(10), pages 4841-4862, October.
    14. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    15. Strømland, Eirik & Torsvik, Gaute, 2019. "Intuitive Prosociality: Heterogeneous Treatment Effects or False Positive?," OSF Preprints hrx2y, Center for Open Science.
    16. Buffat, Justin & Praxmarer, Matthias & Sutter, Matthias, 2023. "The intrinsic value of decision rights: A replication and an extension to team decision making," Journal of Economic Behavior & Organization, Elsevier, vol. 209(C), pages 560-571.
    18. Grabiszewski, Konrad & Horenstein, Alex, 2020. "Effort is not a monotonic function of skills: Results from a global mobile experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 176(C), pages 634-652.
    19. Christoph Merkle & Jan Müller-Dethard & Martin Weber, 2021. "Closing a mental account: the realization effect for gains and losses," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 303-329, March.
    20. Gächter, Simon & Kölle, Felix & Quercia, Simone, 2022. "Preferences and perceptions in Provision and Maintenance public goods," Games and Economic Behavior, Elsevier, vol. 135(C), pages 338-355.
    21. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0248780. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.