Printed from https://ideas.repec.org/a/eee/jomega/v81y2018icp17-25.html

When should we use simple decision models? A synthesis of various research strands

Author

Listed:
  • Katsikopoulos, Konstantinos V.
  • Durbach, Ian N.
  • Stewart, Theodor J.

Abstract

Many decisions can be analyzed and supported by quantitative models. These models tend to be complex psychologically in that they require the elicitation and combination of quantities such as probabilities, utilities, and weights. They may be simplified so that they become more transparent, and lead to increased trust, reflection, and insight. These potential benefits of simplicity should be weighed against its potential costs, notably possible decreases in performance. We review and synthesize research that has used mathematical analyses and computer simulations to investigate if and when simple models perform worse than, as well as, or better than more complex models. Various research strands have pursued this, but have not reached the same conclusions: Work on frequently repeated decisions as in inference and forecasting—which typically are operational and involve one or a few decision makers—has put forth conditions under which simple models are more accurate than more complex ones, and some researchers have proposed that simple models should be preferred. On the other hand, work on more or less one-off decisions as in preference and multi-criteria analysis—which typically are strategic and involve group decision making and multiple stakeholders—has concluded that simple models can at best approximate satisfactorily the more complex models. We show how these conclusions can be reconciled. Additionally, we discuss the theory available for explaining the relative performance of simple and more complex models. Finally, we present an aid to help determine if a simple model should be used, or not, for a particular type of decision problem.

Suggested Citation

  • Katsikopoulos, Konstantinos V. & Durbach, Ian N. & Stewart, Theodor J., 2018. "When should we use simple decision models? A synthesis of various research strands," Omega, Elsevier, vol. 81(C), pages 17-25.
  • Handle: RePEc:eee:jomega:v:81:y:2018:i:c:p:17-25
    DOI: 10.1016/j.omega.2017.09.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0305048317302566
    Download Restriction: Full text for ScienceDirect subscribers only

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Han Bleichrodt & Ulrich Schmidt & Horst Zank, 2009. "Additive Utility in Prospect Theory," Management Science, INFORMS, vol. 55(5), pages 863-873, May.
    2. Makridakis, Spyros & Hibon, Michele, 2000. "The M3-Competition: results, conclusions and implications," International Journal of Forecasting, Elsevier, vol. 16(4), pages 451-476.
    3. Peng, Bo & Song, Haiyan & Crouch, Geoffrey I., 2014. "A meta-analysis of international tourism demand forecasting and implications for practice," Tourism Management, Elsevier, vol. 45(C), pages 181-193.
    4. Aikman, David & Galesic, Mirta & Gigerenzer, Gerd & Kapadia, Sujit & Katsikopoulos, Konstantinos & Kothiyal, Amit & Murphy, Emma & Neumann, Tobias, 2014. "Financial Stability Paper No 28: Taking uncertainty seriously - simplicity versus complexity in financial regulation," Bank of England Financial Stability Papers 28, Bank of England.
    5. Robin M. Hogarth & Natalia Karelaia, 2005. "Simple Models for Multiattribute Choice with Many Alternatives: When It Does and Does Not Pay to Face Trade-offs with Binary Attributes," Management Science, INFORMS, vol. 51(12), pages 1860-1872, December.
    6. Dimitris Bertsimas & Melvyn Sim, 2004. "The Price of Robustness," Operations Research, INFORMS, vol. 52(1), pages 35-53, February.
7. A A Syntetos & J E Boylan & J D Croston, 2005. "On the categorization of demand patterns," Journal of the Operational Research Society, Palgrave Macmillan; The OR Society, vol. 56(5), pages 495-503, May.
    8. Ghandar, Adam & Michalewicz, Zbigniew & Zurbruegg, Ralf, 2016. "The relationship between model complexity and forecasting performance for computer intelligence optimization in finance," International Journal of Forecasting, Elsevier, vol. 32(3), pages 598-613.
    9. Manel Baucells & Juan A. Carrasco & Robin M. Hogarth, 2008. "Cumulative Dominance and Heuristic Performance in Binary Multiattribute Choice," Operations Research, INFORMS, vol. 56(5), pages 1289-1304, October.
    10. Armstrong, J. Scott & Green, Kesten C. & Graefe, Andreas, 2015. "Golden rule of forecasting: Be conservative," Journal of Business Research, Elsevier, vol. 68(8), pages 1717-1731.
    11. Durbach, Ian N. & Calder, Jon M., 2016. "Modelling uncertainty in stochastic multicriteria acceptability analysis," Omega, Elsevier, vol. 64(C), pages 13-23.
    12. Mohammed Abdellaoui & Han Bleichrodt & Corina Paraschiv, 2007. "Loss Aversion Under Prospect Theory: A Parameter-Free Measurement," Management Science, INFORMS, vol. 53(10), pages 1659-1674, October.
13. Luc Laeven & Fabian Valencia, 2010. "Resolution of Banking Crises: The Good, the Bad, and the Ugly," IMF Working Papers 10/146, International Monetary Fund.
    14. Hyndman, Rob J. & Koehler, Anne B. & Snyder, Ralph D. & Grose, Simone, 2002. "A state space framework for automatic forecasting using exponential smoothing methods," International Journal of Forecasting, Elsevier, vol. 18(3), pages 439-454.
    15. Tversky, Amos & Kahneman, Daniel, 1992. "Advances in Prospect Theory: Cumulative Representation of Uncertainty," Journal of Risk and Uncertainty, Springer, vol. 5(4), pages 297-323, October.
    16. Konstantinos Katsikopoulos & Aris Syntetos, 2016. "Bias-Variance Trade-offs in Demand Forecasting," Foresight: The International Journal of Applied Forecasting, International Institute of Forecasters, issue 40, pages 12-19, Winter.
    17. Fildes, Robert & Petropoulos, Fotios, 2015. "Is there a Golden Rule?," Journal of Business Research, Elsevier, vol. 68(8), pages 1742-1745.
    18. Risto Lahdelma & Pekka Salminen, 2001. "SMAA-2: Stochastic Multicriteria Acceptability Analysis for Group Decision Making," Operations Research, INFORMS, vol. 49(3), pages 444-454, June.
19. Aikman, David & Galesic, Mirta & Gigerenzer, Gerd & Kapadia, Sujit & Katsikopoulos, Konstantinos & Kothiyal, Amit & Murphy, Emma & Neumann, Tobias, 2014. "Taking Uncertainty Seriously: Simplicity versus Complexity in Financial Regulation," MPRA Paper 59908, University Library of Munich, Germany.
    20. Han Bleichrodt & Jose Luis Pinto & Peter P. Wakker, 2001. "Making Descriptive Use of Prospect Theory to Improve the Prescriptive Use of Expected Utility," Management Science, INFORMS, vol. 47(11), pages 1498-1514, November.
    21. Ahn, Byeong Seok, 2011. "Compatible weighting method with rank order centroid: Maximum entropy ordered weighted averaging approach," European Journal of Operational Research, Elsevier, vol. 212(3), pages 552-559, August.
    22. Kirkwood, Craig W. & Corner, James L., 1993. "The Effectiveness of Partial Information about Attribute Weights for Ranking Alternatives in Multiattribute Decision Making," Organizational Behavior and Human Decision Processes, Elsevier, vol. 54(3), pages 456-476, April.
    23. Meade, Nigel & Islam, Towhidul, 2015. "Forecasting in telecommunications and ICT—A review," International Journal of Forecasting, Elsevier, vol. 31(4), pages 1105-1126.
    24. Makridakis, Spyros & Hogarth, Robin M. & Gaba, Anil, 2009. "Forecasting and uncertainty in the economic and business world," International Journal of Forecasting, Elsevier, vol. 25(4), pages 794-812, October.
    25. Foley, Aoife M. & Leahy, Paul G. & Marvuglia, Antonino & McKeogh, Eamon J., 2012. "Current methods and advances in forecasting of wind power generation," Renewable Energy, Elsevier, vol. 37(1), pages 1-8.
26. S French, 2013. "Cynefin, statistics and decision analysis," Journal of the Operational Research Society, Palgrave Macmillan; The OR Society, vol. 64(4), pages 547-561, April.
    27. Laura Martignon & Ulrich Hoffrage, 2002. "Fast, frugal, and fit: Simple heuristics for paired comparison," Theory and Decision, Springer, vol. 52(1), pages 29-71, February.
    28. Durbach, Ian N. & Stewart, Theodor J., 2012. "A comparison of simplified value function approaches for treating uncertainty in multi-criteria decision analysis," Omega, Elsevier, vol. 40(4), pages 456-464.
    29. Nikolopoulos, K. & Goodwin, P. & Patelis, A. & Assimakopoulos, V., 2007. "Forecasting with cue information: A comparison of multiple regression with alternative forecasting approaches," European Journal of Operational Research, Elsevier, vol. 180(1), pages 354-368, July.
    30. Rezaei, Jafar, 2016. "Best-worst multi-criteria decision-making method: Some properties and a linear model," Omega, Elsevier, vol. 64(C), pages 126-130.
    31. Yoram Wind & Thomas L. Saaty, 1980. "Marketing Applications of the Analytic Hierarchy Process," Management Science, INFORMS, vol. 26(7), pages 641-658, July.
    32. Todd, Peter M., 2007. "How much information do we need?," European Journal of Operational Research, Elsevier, vol. 177(3), pages 1317-1332, March.
    33. Corrente, Salvatore & Figueira, José Rui & Greco, Salvatore, 2014. "The SMAA-PROMETHEE method," European Journal of Operational Research, Elsevier, vol. 239(2), pages 514-522.
    34. Durbach, Ian N. & Stewart, Theodor J., 2009. "Using expected values to simplify decision making under uncertainty," Omega, Elsevier, vol. 37(2), pages 312-330, April.
    35. Paul Goodwin, 2011. "High on Complexity, Low on Evidence: Are Advanced Forecasting Methods Always as Good as They Seem?," Foresight: The International Journal of Applied Forecasting, International Institute of Forecasters, issue 23, pages 10-12, Fall.
    36. Stephan Kolassa, 2016. "Sometimes It's Better to Be Simple than Correct," Foresight: The International Journal of Applied Forecasting, International Institute of Forecasters, issue 40, pages 20-26, Winter.
    37. Durbach, Ian N., 2014. "Outranking under uncertainty using scenarios," European Journal of Operational Research, Elsevier, vol. 232(1), pages 98-108.
    38. Peter S. Fader & Bruce G. S. Hardie & Chun-Yao Huang, 2004. "A Dynamic Changepoint Model for New Product Sales Forecasting," Marketing Science, INFORMS, vol. 23(1), pages 50-65, October.
    39. Konstantinos V. Katsikopoulos, 2013. "Why Do Simple Heuristics Perform Well in Choices with Binary Attributes?," Decision Analysis, INFORMS, vol. 10(4), pages 327-340, December.
    40. James E. Smith & Detlof von Winterfeldt, 2004. "Anniversary Article: Decision Analysis in Management Science," Management Science, INFORMS, vol. 50(5), pages 561-574, May.
    41. Craig W. Kirkwood, 1992. "Estimating the Impact of Uncertainty on a Deterministic Multiattribute Evaluation," Management Science, INFORMS, vol. 38(6), pages 819-826, June.
    42. Konstantinos V. Katsikopoulos, 2011. "Psychological Heuristics for Making Inferences: Definition, Performance, and the Emerging Theory and Practice," Decision Analysis, INFORMS, vol. 8(1), pages 10-29, March.
    43. Graefe, Andreas, 2015. "Improving forecasts using equally weighted predictors," Journal of Business Research, Elsevier, vol. 68(8), pages 1792-1799.
44. Keller, Niklas & Katsikopoulos, Konstantinos V., 2016. "On the role of psychological heuristics in operational research; and a demonstration in military stability operations," European Journal of Operational Research, Elsevier, vol. 249(3), pages 1063-1073.
    45. F. Hutton Barron & Bruce E. Barrett, 1996. "Decision Quality Using Ranked Attribute Weights," Management Science, INFORMS, vol. 42(11), pages 1515-1523, November.
    46. Mohammed Abdellaoui, 2000. "Parameter-Free Elicitation of Utility and Probability Weighting Functions," Management Science, INFORMS, vol. 46(11), pages 1497-1512, November.
    47. Green, Kesten C. & Armstrong, J. Scott, 2015. "Simple versus complex forecasting: The evidence," Journal of Business Research, Elsevier, vol. 68(8), pages 1678-1685.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Cinelli, Marco & Kadziński, Miłosz & Gonzalez, Michael & Słowiński, Roman, 2020. "How to support the application of multiple criteria decision analysis? Let us start with a comprehensive taxonomy," Omega, Elsevier, vol. 96(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jomega:v:81:y:2018:i:c:p:17-25. See general information about how to correct material in RePEc.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Haili He. General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/375/description#description.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.