
Identification and Review of Sensitivity Analysis Methods

Authors

  • H. Christopher Frey
  • Sumeet R. Patil

Abstract

Identification and qualitative comparison of sensitivity analysis methods that have been used across various disciplines, and that merit consideration for application to food‐safety risk assessment models, are presented in this article. Sensitivity analysis can help in identifying critical control points, prioritizing additional data collection or research, and verifying and validating a model. Ten sensitivity analysis methods, including four mathematical methods, five statistical methods, and one graphical method, are identified. The selected methods are compared on the basis of their applicability to different types of models, computational issues such as initial data requirement and complexity of their application, representation of the sensitivity, and the specific uses of these methods. Applications of these methods are illustrated with examples from various fields. No one method is clearly best for food‐safety risk models. In general, use of two or more methods, preferably with dissimilar theoretical foundations, may be needed to increase confidence in the ranking of key inputs.

Suggested Citation

  • H. Christopher Frey & Sumeet R. Patil, 2002. "Identification and Review of Sensitivity Analysis Methods," Risk Analysis, John Wiley & Sons, vol. 22(3), pages 553-578, June.
  • Handle: RePEc:wly:riskan:v:22:y:2002:i:3:p:553-578
    DOI: 10.1111/0272-4332.00039

    Download full text from publisher

    File URL: https://doi.org/10.1111/0272-4332.00039
    Download Restriction: no

    File URL: https://libkey.io/10.1111/0272-4332.00039?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Kleijnen, Jack P. C., 1995. "Verification and validation of simulation models," European Journal of Operational Research, Elsevier, vol. 82(1), pages 145-162, April.
    2. Fraedrich, D. & Goldberg, A., 2000. "A methodological framework for the validation of predictive simulations," European Journal of Operational Research, Elsevier, vol. 124(1), pages 55-62, July.
    3. Harry M. Marks & Margaret E. Coleman & C.‐T. Jordan Lin & Tanya Roberts, 1998. "Topics in Microbial Risk Assessment: Dynamic Flow Tree Process," Risk Analysis, John Wiley & Sons, vol. 18(3), pages 309-328, June.
    4. Gregory C. Critchfield & Keith E. Willard, 1986. "Probabilistic Analysis of Decision Trees Using Monte Carlo Simulation," Medical Decision Making, , vol. 6(2), pages 85-92, June.
    5. Herbert A. Simon, 1996. "The Sciences of the Artificial, 3rd Edition," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262691914, December.
    6. Jon F. Merz & Mitchell J. Small & Paul S. Fischbeck, 1992. "Measuring Decision Sensitivity," Medical Decision Making, , vol. 12(3), pages 189-196, August.
    7. Kleijnen, Jack P. C. & Sargent, Robert G., 2000. "A methodology for fitting and validating metamodels in simulation," European Journal of Operational Research, Elsevier, vol. 120(1), pages 14-29, January.
    8. Saltelli, Andrea & Bolado, Ricardo, 1998. "An alternative way to compute Fourier amplitude sensitivity test (FAST)," Computational Statistics & Data Analysis, Elsevier, vol. 26(4), pages 445-460, February.
    9. Julia J. Pet‐Armacost & Jose Sepulveda & Milton Sakude, 1999. "Monte Carlo Sensitivity Analysis of Unknown Parameters in Hazardous Materials Transportation Risk Assessment," Risk Analysis, John Wiley & Sons, vol. 19(6), pages 1173-1184, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xuefei Lu & Alessandro Rudi & Emanuele Borgonovo & Lorenzo Rosasco, 2020. "Faster Kriging: Facing High-Dimensional Simulators," Operations Research, INFORMS, vol. 68(1), pages 233-249, January.
    2. Tunali, S. & Batmaz, I., 2003. "A metamodeling methodology involving both qualitative and quantitative input factors," European Journal of Operational Research, Elsevier, vol. 150(2), pages 437-450, October.
    3. Strang, Kenneth David, 2012. "Importance of verifying queue model assumptions before planning with simulation software," European Journal of Operational Research, Elsevier, vol. 218(2), pages 493-504.
    4. Stinstra, E., 2006. "The meta-model approach for simulation-based design optimization," Other publications TiSEM 713f828a-4716-4a19-af00-e, Tilburg University, School of Economics and Management.
    5. Ekren, Orhan & Ekren, Banu Yetkin, 2008. "Size optimization of a PV/wind hybrid energy conversion system with battery storage using response surface methodology," Applied Energy, Elsevier, vol. 85(11), pages 1086-1101, November.
    6. Clazien J. De Vos & Helmut W. Saatkamp & Mirjam Nielen & Ruud B. M. Huirne, 2006. "Sensitivity Analysis to Evaluate the Impact of Uncertain Factors in a Scenario Tree Model for Classical Swine Fever Introduction," Risk Analysis, John Wiley & Sons, vol. 26(5), pages 1311-1322, October.
    7. Batmaz, Inci & Tunali, Semra, 2003. "Small response surface designs for metamodel estimation," European Journal of Operational Research, Elsevier, vol. 145(2), pages 455-470, March.
    8. Ekren, Orhan & Ekren, Banu Y. & Ozerdem, Baris, 2009. "Break-even analysis and size optimization of a PV/wind hybrid energy conversion system with battery storage - A case study," Applied Energy, Elsevier, vol. 86(7-8), pages 1043-1054, July.
    9. Kleijnen, J.P.C., 1997. "Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models," Discussion Paper 1997-52, Tilburg University, Center for Economic Research.
    10. Noguera, Jose H. & Watson, Edward F., 2006. "Response surface analysis of a multi-product batch processing facility using a simulation metamodel," International Journal of Production Economics, Elsevier, vol. 102(2), pages 333-343, August.
    11. Alan Hevner & Isabelle Comyn-Wattiau & Jacky Akoka & Nicolas Prat, 2018. "A pragmatic approach for identifying and managing design science research goals and evaluation criteria," Post-Print hal-02283783, HAL.
    12. Tobias Knabke & Sebastian Olbrich, 2018. "Building novel capabilities to enable business intelligence agility: results from a quantitative study," Information Systems and e-Business Management, Springer, vol. 16(3), pages 493-546, August.
    13. A. E. Ades & Karl Claxton & Mark Sculpher, 2006. "Evidence synthesis, parameter correlation and probabilistic sensitivity analysis," Health Economics, John Wiley & Sons, Ltd., vol. 15(4), pages 373-381, April.
    14. Katarzyna Growiec & Jakub Growiec & Bogumil Kaminski, 2017. "Social Network Structure and The Trade-Off Between Social Utility and Economic Performance," KAE Working Papers 2017-026, Warsaw School of Economics, Collegium of Economic Analysis.
    15. Sunder Shyam, 2011. "Imagined Worlds of Accounting," Accounting, Economics, and Law: A Convivium, De Gruyter, vol. 1(1), pages 1-14, January.
    16. Kleijnen, Jack P. C., 2005. "An overview of the design and analysis of simulation experiments for sensitivity analysis," European Journal of Operational Research, Elsevier, vol. 164(2), pages 287-300, July.
    17. Fiori Stefano, 2005. "The emergence of instructions : some open problems in Hayek's theory," CESMEP Working Papers 200504, University of Turin.
    18. Plischke, Elmar & Borgonovo, Emanuele, 2019. "Copula theory and probabilistic sensitivity analysis: Is there a connection?," European Journal of Operational Research, Elsevier, vol. 277(3), pages 1046-1059.
    19. McCown, R. L., 2002. "Changing systems for supporting farmers' decisions: problems, paradigms, and prospects," Agricultural Systems, Elsevier, vol. 74(1), pages 179-220, October.
    20. Jin P. Gerlach & Ronald T. Cenfetelli, 2022. "Overcoming the Single-IS Paradigm in Individual-Level IS Research," Information Systems Research, INFORMS, vol. 33(2), pages 476-488, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:riskan:v:22:y:2002:i:3:p:553-578. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://doi.org/10.1111/(ISSN)1539-6924 .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.