The Aggregation of Expert Judgment: Do Good Things Come to Those Who Weight?

Authors

  • Fergus Bolger
  • Gene Rowe

Abstract

Good policy making should be based on available scientific knowledge. Sometimes this knowledge is well established through research, but often scientists must simply express their judgment, and this is particularly so in risk scenarios that are characterized by high levels of uncertainty. Usually in such cases, the opinions of several experts will be sought in order to pool knowledge and reduce error, raising the question of whether individual expert judgments should be given different weights. We argue—against the commonly advocated “classical method”—that no significant benefits are likely to accrue from unequal weighting in mathematical aggregation. Our argument hinges on the difficulty of constructing reliable and valid measures of substantive expertise upon which to base weights. Practical problems associated with attempts to evaluate experts are also addressed. While our discussion focuses on one specific weighting scheme that is currently gaining in popularity for expert knowledge elicitation, our general thesis applies to externally imposed unequal weighting schemes more generally.
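
To make the debate concrete: the "mathematical aggregation" at issue is typically a linear opinion pool, in which each expert's probability distribution is averaged with either equal or unequal weights. The Python sketch below only illustrates that contrast and is not the method evaluated in the article; the expert distributions, the performance scores, and the function names are hypothetical, and the unequal weights merely stand in for the calibration-based weights of the classical method.

    import numpy as np

    # Hypothetical probability judgments from three experts over the same
    # three mutually exclusive outcomes (rows: experts, columns: outcomes).
    expert_pmfs = np.array([
        [0.70, 0.20, 0.10],
        [0.50, 0.30, 0.20],
        [0.20, 0.50, 0.30],
    ])

    def linear_pool(pmfs, weights):
        """Linear opinion pool: a convex combination of the experts' pmfs."""
        weights = np.asarray(weights, dtype=float)
        weights = weights / weights.sum()   # normalise weights to sum to one
        return weights @ pmfs               # weighted average across experts

    # Equal weighting: every expert counts the same.
    equal = linear_pool(expert_pmfs, np.ones(len(expert_pmfs)))

    # Unequal weighting: hypothetical performance scores standing in for
    # calibration/informativeness measured on seed questions.
    unequal = linear_pool(expert_pmfs, np.array([0.9, 0.6, 0.1]))

    print("Equal-weight pool:  ", equal)
    print("Unequal-weight pool:", unequal)

Whether the second pool can be expected to outperform the first depends entirely on how reliably and validly scores like these can be measured, which is the question the article takes up.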

Suggested Citation

  • Fergus Bolger & Gene Rowe, 2015. "The Aggregation of Expert Judgment: Do Good Things Come to Those Who Weight?," Risk Analysis, John Wiley & Sons, vol. 35(1), pages 5-11, January.
  • Handle: RePEc:wly:riskan:v:35:y:2015:i:1:p:5-11
    DOI: 10.1111/risa.12272

    Download full text from publisher

    File URL: https://doi.org/10.1111/risa.12272
    Download Restriction: no

    File URL: https://libkey.io/10.1111/risa.12272?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy of this item that you can access through your library subscription.

    References listed on IDEAS

    1. Soyer, Emre & Hogarth, Robin M., 2012. "The illusion of predictability: How regression statistics mislead experts," International Journal of Forecasting, Elsevier, vol. 28(3), pages 695-711.
    2. Willy Aspinall, 2010. "A route to more tractable expert advice," Nature, Nature, vol. 463(7279), pages 294-295, January.
    3. Robert T. Clemen & Robert L. Winkler, 1999. "Combining Probability Distributions From Experts in Risk Analysis," Risk Analysis, John Wiley & Sons, vol. 19(2), pages 187-203, April.
    4. Allan H. Murphy & Robert L. Winkler, 1977. "Reliability of Subjective Probability Forecasts of Precipitation and Temperature," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 26(1), pages 41-47, March.
    5. Mark A Burgman & Marissa McBride & Raquel Ashton & Andrew Speirs-Bridge & Louisa Flander & Bonnie Wintle & Fiona Fidler & Libby Rumpff & Charles Twardy, 2011. "Expert Status and Performance," PLOS ONE, Public Library of Science, vol. 6(7), pages 1-7, July.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Fergus Bolger & Gene Rowe, 2015. "There is Data, and then there is Data: Only Experimental Evidence will Determine the Utility of Differential Weighting of Expert Judgment," Risk Analysis, John Wiley & Sons, vol. 35(1), pages 21-26, January.
    2. Robert L. Winkler, 2015. "The Importance of Communicating Uncertainties in Forecasts: Overestimating the Risks from Winter Storm Juno," Risk Analysis, John Wiley & Sons, vol. 35(3), pages 349-353, March.
    3. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    4. Anil Gaba & Dana G. Popescu & Zhi Chen, 2019. "Assessing Uncertainty from Point Forecasts," Management Science, INFORMS, vol. 65(1), pages 90-106, January.
    5. Satopää, Ville A., 2021. "Improving the wisdom of crowds with analysis of variance of predictions of related outcomes," International Journal of Forecasting, Elsevier, vol. 37(4), pages 1728-1747.
    6. Anca M. Hanea & Marissa F. McBride & Mark A. Burgman & Bonnie C. Wintle, 2018. "The Value of Performance Weights and Discussion in Aggregated Expert Judgments," Risk Analysis, John Wiley & Sons, vol. 38(9), pages 1781-1794, September.
    7. Pita Spruijt & Anne B. Knol & Arthur C. Petersen & Erik Lebret, 2019. "Expert Views on Their Role as Policy Advisor: Pilot Study for the Cases of Electromagnetic Fields, Particulate Matter, and Antimicrobial Resistance," Risk Analysis, John Wiley & Sons, vol. 39(5), pages 968-974, May.
    8. Chen Li & Ning Liu, 2021. "What to tell? Wise communication and wise crowd," Theory and Decision, Springer, vol. 90(2), pages 279-299, March.
    9. Cameron J. Williams & Kevin J. Wilson & Nina Wilson, 2021. "A comparison of prior elicitation aggregation using the classical method and SHELF," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 920-940, July.
    10. Anne E. Smith, 2018. "Setting Air Quality Standards for PM2.5: A Role for Subjective Uncertainty in NAAQS Quantitative Risk Assessments?," Risk Analysis, John Wiley & Sons, vol. 38(11), pages 2318-2339, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a rough sketch of this kind of scoring follows the list.
    1. Anca M. Hanea & Marissa F. McBride & Mark A. Burgman & Bonnie C. Wintle, 2018. "The Value of Performance Weights and Discussion in Aggregated Expert Judgments," Risk Analysis, John Wiley & Sons, vol. 38(9), pages 1781-1794, September.
    2. Bolger, Fergus & Wright, George, 2017. "Use of expert knowledge to anticipate the future: Issues, analysis and directions," International Journal of Forecasting, Elsevier, vol. 33(1), pages 230-243.
    3. Hanea, Anca & Wilkinson, David Peter & McBride, Marissa & Lyon, Aidan & van Ravenzwaaij, Don & Singleton Thorn, Felix & Gray, Charles T. & Mandel, David R. & Willcox, Aaron & Gould, Elliot, 2021. "Mathematically aggregating experts' predictions of possible futures," MetaArXiv rxmh7, Center for Open Science.
    4. Eric Libby & Leon Glass, 2010. "The Calculus of Committee Composition," PLOS ONE, Public Library of Science, vol. 5(9), pages 1-8, September.
    5. Julia R. Falconer & Eibe Frank & Devon L. L. Polaschek & Chaitanya Joshi, 2022. "Methods for Eliciting Informative Prior Distributions: A Critical Review," Decision Analysis, INFORMS, vol. 19(3), pages 189-204, September.
    6. David V. Budescu & Eva Chen, 2015. "Identifying Expertise to Extract the Wisdom of Crowds," Management Science, INFORMS, vol. 61(2), pages 267-280, February.
    7. Abigail R Colson & Roger M Cooke, 2018. "Expert Elicitation: Using the Classical Model to Validate Experts’ Judgments," Review of Environmental Economics and Policy, Association of Environmental and Resource Economists, vol. 12(1), pages 113-132.
    8. Christopher W. Karvetski & Kenneth C. Olson & David R. Mandel & Charles R. Twardy, 2013. "Probabilistic Coherence Weighting for Optimizing Expert Forecasts," Decision Analysis, INFORMS, vol. 10(4), pages 305-326, December.
    9. Brian H. MacGillivray, 2019. "Null Hypothesis Testing ≠ Scientific Inference: A Critique of the Shaky Premise at the Heart of the Science and Values Debate, and a Defense of Value‐Neutral Risk Assessment," Risk Analysis, John Wiley & Sons, vol. 39(7), pages 1520-1532, July.
    10. Meissner, Philip & Brands, Christian & Wulf, Torsten, 2017. "Quantifiying blind spots and weak signals in executive judgment: A structured integration of expert judgment into the scenario development process," International Journal of Forecasting, Elsevier, vol. 33(1), pages 244-253.
    11. Kenneth Gillingham & William D. Nordhaus & David Anthoff & Geoffrey Blanford & Valentina Bosetti & Peter Christensen & Haewon McJeon & John Reilly & Paul Sztorc, 2015. "Modeling Uncertainty in Climate Change: A Multi-Model Comparison," NBER Working Papers 21637, National Bureau of Economic Research, Inc.
    12. Armstrong, J. Scott & Green, Kesten C. & Graefe, Andreas, 2015. "Golden rule of forecasting: Be conservative," Journal of Business Research, Elsevier, vol. 68(8), pages 1717-1731.
    13. Avner Engel & Shalom Shachar, 2006. "Measuring and optimizing systems' quality costs and project duration," Systems Engineering, John Wiley & Sons, vol. 9(3), pages 259-280, September.
    14. Atanasov, Pavel & Witkowski, Jens & Ungar, Lyle & Mellers, Barbara & Tetlock, Philip, 2020. "Small steps to accuracy: Incremental belief updaters are better forecasters," Organizational Behavior and Human Decision Processes, Elsevier, vol. 160(C), pages 19-35.
    15. McKenzie, Craig R.M. & Liersch, Michael J. & Yaniv, Ilan, 2008. "Overconfidence in interval estimates: What does expertise buy you?," Organizational Behavior and Human Decision Processes, Elsevier, vol. 107(2), pages 179-191, November.
    16. Stapleton, L.M. & Hanna, P. & Ravenscroft, N. & Church, A., 2014. "A flexible ecosystem services proto-typology based on public opinion," Ecological Economics, Elsevier, vol. 106(C), pages 83-90.
    17. Franz Dietrich & Christian List, 2017. "Probabilistic opinion pooling generalized. Part one: general agendas," Social Choice and Welfare, Springer;The Society for Social Choice and Welfare, vol. 48(4), pages 747-786, April.
    18. K J Wilson & M Farrow, 2010. "Bayes linear kinematics in the analysis of failure rates and failure time distributions," Journal of Risk and Reliability, vol. 224(4), pages 309-321, December.
    19. repec:cup:judgdm:v:13:y:2018:i:6:p:607-621 is not listed on IDEAS
    20. Patrick Afflerbach & Christopher Dun & Henner Gimpel & Dominik Parak & Johannes Seyfried, 2021. "A Simulation-Based Approach to Understanding the Wisdom of Crowds Phenomenon in Aggregating Expert Judgment," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 63(4), pages 329-348, August.
    21. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
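
    As a rough sketch of the kind of scoring just described (the handles and citation sets below are invented, and this is not necessarily the exact procedure RePEc/CitEc uses), relatedness can be approximated by counting shared cited works and shared citing works:

        # Hypothetical illustration: rank candidate items by how many works they
        # cite in common with this item (bibliographic coupling) and how many
        # works cite both (co-citation). All handles and sets are made up.
        this_refs   = {"aspinall2010", "clemen1999", "murphy1977"}
        this_citers = {"hanea2018", "winkler2015"}

        candidates = {
            "item_a": ({"aspinall2010", "clemen1999"}, {"hanea2018"}),
            "item_b": ({"murphy1977"}, set()),
        }

        def relatedness(refs, citers):
            """Count works cited in common plus works citing both items."""
            return len(refs & this_refs) + len(citers & this_citers)

        ranked = sorted(candidates, key=lambda k: relatedness(*candidates[k]), reverse=True)
        print(ranked)  # item_a shares more cited and citing works, so it ranks first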

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.