
A comparison of prior elicitation aggregation using the classical method and SHELF

Author

Listed:
  • Cameron J. Williams
  • Kevin J. Wilson
  • Nina Wilson

Abstract

Subjective Bayesian prior distributions elicited from experts can be aggregated to form group priors. Using an expert elicitation carried out for a clinical trial, this paper compares aggregated priors formed by equal-weight aggregation, the classical method and the Sheffield elicitation framework (SHELF) to each other and to individual expert priors. Proper scoring rules are used to assess the informativeness and calibration of the aggregated and individual distributions. All three aggregation methods outperform the individual experts, and the Sheffield elicitation framework performs best among them.
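The comparison the abstract describes can be sketched in a few lines. The snippet below is purely illustrative and uses hypothetical expert priors and a hypothetical realised value, not the paper's elicitation data; it shows the simplest of the three methods (equal-weight aggregation, i.e. a linear opinion pool) scored with the logarithmic proper scoring rule. The classical method and SHELF involve additional machinery (calibration questions and behavioural consensus, respectively) not shown here.

```python
import numpy as np

# Hypothetical expert priors for an uncertain quantity, each summarised
# as a normal distribution (mean, sd). These are illustrative values only.
experts = [(0.30, 0.10), (0.45, 0.08), (0.35, 0.15)]

def normal_pdf(x, mean, sd):
    """Density of a normal distribution at x."""
    return np.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def pooled_pdf(x, dists, weights=None):
    """Equal-weight aggregation: a linear opinion pool averaging densities."""
    if weights is None:
        weights = [1.0 / len(dists)] * len(dists)
    return sum(w * normal_pdf(x, m, s) for w, (m, s) in zip(weights, dists))

def log_score(density, observed):
    """Logarithmic score, a proper scoring rule: log density at the
    realised value. Higher is better; it rewards a distribution that is
    both well calibrated and informative."""
    return float(np.log(density(observed)))

observed = 0.40  # hypothetical realised value of the quantity
pool_score = log_score(lambda x: pooled_pdf(x, experts), observed)
expert_scores = [
    log_score(lambda x, m=m, s=s: normal_pdf(x, m, s), observed)
    for (m, s) in experts
]
```

In a study like this one, scores such as these would be computed across many quantities whose true values later become known, so that aggregation methods and individual experts can be ranked on average performance.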

Suggested Citation

  • Cameron J. Williams & Kevin J. Wilson & Nina Wilson, 2021. "A comparison of prior elicitation aggregation using the classical method and SHELF," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 920-940, July.
  • Handle: RePEc:bla:jorssa:v:184:y:2021:i:3:p:920-940
    DOI: 10.1111/rssa.12691

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12691
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssa.12691?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. James E. Matheson & Robert L. Winkler, 1976. "Scoring Rules for Continuous Probability Distributions," Management Science, INFORMS, vol. 22(10), pages 1087-1096, June.
    2. Flandoli, F. & Giorgi, E. & Aspinall, W.P. & Neri, A., 2011. "Comparison of a new expert elicitation model with the Classical Model, equal weights and single experts, using a cross-validation technique," Reliability Engineering and System Safety, Elsevier, vol. 96(10), pages 1292-1310.
    3. John Quigley & Abigail Colson & Willy Aspinall & Roger M. Cooke, 2018. "Elicitation in the Classical Model," International Series in Operations Research & Management Science, in: Luis C. Dias & Alec Morton & John Quigley (ed.), Elicitation, chapter 0, pages 15-36, Springer.
    4. Alessandra Babuscia & Kar-Ming Cheung, 2014. "An approach to perform expert elicitation for engineering design risk analysis: methodology and experimental results," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 177(2), pages 475-497, February.
    5. T. Ganguly & K. J. Wilson & J. Quigley & R. M. Cooke & Alessandra Babuscia & Kar-Ming Cheung, 2014. "Reaction to ‘An approach to perform expert elicitation for engineering design risk analysis: methodology and experimental results’," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 177(4), pages 981-985, October.
    6. Fergus Bolger & Gene Rowe, 2015. "The Aggregation of Expert Judgment: Do Good Things Come to Those Who Weight?," Risk Analysis, John Wiley & Sons, vol. 35(1), pages 5-11, January.
    7. John Paul Gosling, 2018. "SHELF: The Sheffield Elicitation Framework," International Series in Operations Research & Management Science, in: Luis C. Dias & Alec Morton & John Quigley (ed.), Elicitation, chapter 0, pages 61-93, Springer.
    8. R. Winkler & Javier Muñoz & José Cervera & José Bernardo & Gail Blattenberger & Joseph Kadane & Dennis Lindley & Allan Murphy & Robert Oliver & David Ríos-Insua, 1996. "Scoring rules and the evaluation of probabilities," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 5(1), pages 1-60, June.
    9. Ine H. J. Van Der Fels‐Klerx & Louis H. J. Goossens & Helmut W. Saatkamp & Suzan H. S. Horst, 2002. "Elicitation of Quantitative Data from a Heterogeneous Expert Panel: Formal Process and Application in Animal Health," Risk Analysis, John Wiley & Sons, vol. 22(1), pages 67-81, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Julia R. Falconer & Eibe Frank & Devon L. L. Polaschek & Chaitanya Joshi, 2022. "Methods for Eliciting Informative Prior Distributions: A Critical Review," Decision Analysis, INFORMS, vol. 19(3), pages 189-204, September.
    2. Williamson, S. Faye & Jacko, Peter & Jaki, Thomas, 2022. "Generalisations of a Bayesian decision-theoretic randomisation procedure and the impact of delayed responses," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Carless, Travis S. & Redus, Kenneth & Dryden, Rachel, 2021. "Estimating nuclear proliferation and security risks in emerging markets using Bayesian Belief Networks," Energy Policy, Elsevier, vol. 159(C).
    2. Victor Jose, 2009. "A Characterization for the Spherical Scoring Rule," Theory and Decision, Springer, vol. 66(3), pages 263-281, March.
    3. Wilson, Kevin J., 2017. "An investigation of dependence in expert judgement studies with multiple experts," International Journal of Forecasting, Elsevier, vol. 33(1), pages 325-336.
    4. Tilmann Gneiting & Larissa Stanberry & Eric Grimit & Leonhard Held & Nicholas Johnson, 2008. "Assessing probabilistic forecasts of multivariate quantities, with an application to ensemble predictions of surface winds," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 17(2), pages 211-235, August.
    5. Michaël Zamo & Liliane Bel & Olivier Mestre, 2021. "Sequential aggregation of probabilistic forecasts—Application to wind speed ensemble forecasts," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(1), pages 202-225, January.
    6. Yael Grushka-Cockayne & Kenneth C. Lichtendahl Jr. & Victor Richmond R. Jose & Robert L. Winkler, 2017. "Quantile Evaluation, Sensitivity to Bracketing, and Sharing Business Payoffs," Operations Research, INFORMS, vol. 65(3), pages 712-728, June.
    7. Karl Schlag & James Tremewan & Joël Weele, 2015. "A penny for your thoughts: a survey of methods for eliciting beliefs," Experimental Economics, Springer;Economic Science Association, vol. 18(3), pages 457-490, September.
    8. Christoph Werner & Tim Bedford & John Quigley, 2018. "Sequential Refined Partitioning for Probabilistic Dependence Assessment," Risk Analysis, John Wiley & Sons, vol. 38(12), pages 2683-2702, December.
    9. Eggstaff, Justin W. & Mazzuchi, Thomas A. & Sarkani, Shahram, 2014. "The effect of the number of seed variables on the performance of Cooke′s classical model," Reliability Engineering and System Safety, Elsevier, vol. 121(C), pages 72-82.
    10. Gneiting, Tilmann, 2011. "Quantiles as optimal point forecasts," International Journal of Forecasting, Elsevier, vol. 27(2), pages 197-207, April.
    11. Makariou, Despoina & Barrieu, Pauline & Tzougas, George, 2021. "A finite mixture modelling perspective for combining experts’ opinions with an application to quantile-based risk measures," LSE Research Online Documents on Economics 110763, London School of Economics and Political Science, LSE Library.
    12. Carol Alexander & Michael Coulon & Yang Han & Xiaochun Meng, 2021. "Evaluating the Discrimination Ability of Proper Multivariate Scoring Rules," Papers 2101.12693, arXiv.org.
    13. D. Johnstone, 2007. "The Value of a Probability Forecast from Portfolio Theory," Theory and Decision, Springer, vol. 63(2), pages 153-203, September.
    14. Victor Richmond R. Jose & Robert F. Nau & Robert L. Winkler, 2009. "Sensitivity to Distance and Baseline Distributions in Forecast Evaluation," Management Science, INFORMS, vol. 55(4), pages 582-590, April.
    15. Robert L. Winkler & Yael Grushka-Cockayne & Kenneth C. Lichtendahl Jr. & Victor Richmond R. Jose, 2019. "Probability Forecasts and Their Combination: A Research Perspective," Decision Analysis, INFORMS, vol. 16(4), pages 239-260, December.
    16. Victor Richmond R. Jose & Robert L. Winkler, 2009. "Evaluating Quantile Assessments," Operations Research, INFORMS, vol. 57(5), pages 1287-1297, October.
    17. Alexander, Carol & Han, Yang & Meng, Xiaochun, 2023. "Static and dynamic models for multivariate distribution forecasts: Proper scoring rule tests of factor-quantile versus multivariate GARCH models," International Journal of Forecasting, Elsevier, vol. 39(3), pages 1078-1096.
    18. Kenneth C. Lichtendahl & Yael Grushka-Cockayne & Robert L. Winkler, 2013. "Is It Better to Average Probabilities or Quantiles?," Management Science, INFORMS, vol. 59(7), pages 1594-1611, July.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:184:y:2021:i:3:p:920-940. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item about which we are uncertain.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.