
Defining and characterising structural uncertainty in decision analytic models

Listed author(s):
  • Laura Bojke (Centre for Health Economics, University of York)
  • Karl Claxton (Centre for Health Economics, University of York)
  • Stephen Palmer (Centre for Health Economics, University of York)
  • Mark Sculpher (Centre for Health Economics, University of York)

An inappropriate structure for a decision analytic model can potentially invalidate estimates of cost-effectiveness and estimates of the value of further research. However, there are often a number of alternative and credible structural assumptions that can be made. Although it is common practice to acknowledge potential limitations in model structure, there is a lack of clarity about methods to characterise the uncertainty surrounding alternative structural assumptions and their contribution to decision uncertainty.

A review of decision models commissioned by the NHS Health Technology Assessment Programme was undertaken to identify the types of model uncertainty described in the literature. A second review was undertaken to identify approaches to characterising these uncertainties. The assessment of structural uncertainty has received little attention in the health economics literature. A common approach is to compute results for each alternative model specification and to present them as scenario analyses; it is then left to the decision maker to assess the credibility of the alternative structures when interpreting the range of results.

The review of methods to explicitly characterise structural uncertainty identified two approaches: (1) model averaging, in which alternative models with different specifications are built and their results averaged using explicit prior distributions, often based on expert opinion; and (2) model selection on the basis of predictive performance or goodness of fit. For a number of reasons, these are neither appropriate nor desirable methods to characterise structural uncertainty in decision analytic models.

When faced with a choice between multiple models, an alternative approach allows structural uncertainty to be considered explicitly without discarding potentially relevant model structures: the uncertainty can be directly characterised (or parameterised) in the model itself. This approach is analogous to model averaging over individual or sets of model inputs, but it also allows the value of information associated with resolving structural uncertainties to be established.
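To make the parameterisation idea concrete, the sketch below is a hypothetical two-structure example, not taken from the paper: the choice between two model structures is treated as an uncertain indicator with an explicit prior weight, propagated through a probabilistic sensitivity analysis, and the expected value of resolving the structural uncertainty is computed. All structure names, distributions, and numbers (treatment effect, costs, willingness-to-pay, prior weight) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000
wtp = 20_000  # willingness-to-pay per QALY (illustrative)

# Two candidate structures for the same decision problem, e.g. with and
# without waning of the treatment effect. Each maps sampled inputs to
# (incremental cost, incremental QALYs) for the new treatment.
def structure_a(eff, cost):
    return cost, eff            # treatment effect assumed to persist

def structure_b(eff, cost):
    return cost, 0.6 * eff      # treatment effect assumed to wane

# Ordinary probabilistic sensitivity analysis: sample the model inputs.
eff = rng.normal(0.30, 0.10, n_sims)    # incremental QALYs
cost = rng.normal(4_000, 800, n_sims)   # incremental cost

# Structural uncertainty parameterised as a Bernoulli indicator with an
# explicit prior weight (e.g. elicited from experts) on structure A.
p_a = 0.7
use_a = rng.random(n_sims) < p_a

ca, qa = structure_a(eff, cost)
cb, qb = structure_b(eff, cost)
nb_a = wtp * qa - ca                    # incremental net benefit, structure A
nb_b = wtp * qb - cb                    # incremental net benefit, structure B
nb = np.where(use_a, nb_a, nb_b)        # model-averaged net benefit

# Decision under current uncertainty: adopt if expected net benefit > 0.
env_current = max(nb.mean(), 0.0)

# Value of resolving the structural uncertainty: if the true structure
# were known, the adoption decision could differ by structure.
ev_perfect = p_a * max(nb_a.mean(), 0.0) + (1 - p_a) * max(nb_b.mean(), 0.0)
evpi_structure = ev_perfect - env_current

print(f"E[net benefit] under structural uncertainty: {nb.mean():.0f}")
print(f"Value of resolving the structural choice:    {evpi_structure:.0f}")
```

In this toy setup the model-averaged decision is to adopt, but resolving which structure is correct carries positive expected value because the two structures disagree about whether adoption is worthwhile; that value would be invisible if a single structure were selected and the alternative simply discarded.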

Paper provided by Centre for Health Economics, University of York in its series Working Papers with number 009cherp.

Length: 25 pages
Date of creation: Mar 2006
Handle: RePEc:chy:respap:9cherp

