
Physician Evaluation after Medical Errors: Does Having a Computer Decision Aid Help or Hurt in Hindsight?

Author

Listed:
  • Mark V. Pezzo

    (140 7th Ave, South, DAV 258, University of South Florida, St. Petersburg, FL 33701; pezzo@stpt.usf.edu)

  • Stephanie P. Pezzo

    (College of Medicine, University of South Florida, Tampa, FL)

Abstract

Objective. The authors examined whether physicians’ use of computerized decision aids affects patient satisfaction and/or blame for medical outcomes. Method. Experiment 1: Fifty-nine undergraduates read about a doctor who made either a correct or incorrect diagnosis and either used a decision aid or did not. All rated the quality of the doctor's decision and the likelihood of recommending the doctor. Those receiving a negative outcome also rated negligence and likelihood of suing. Experiment 2: One hundred sixty-six medical students and 154 undergraduates read negative-outcome scenarios in which a doctor either agreed with the aid, heeded the aid against his own opinion, defied the aid in favor of his own opinion, or did not use a decision aid. Subjects rated doctor fault and competence and the appropriateness of using decision aids in medicine. Medical students made judgments for themselves and for a layperson. Results. Experiment 1: Using a decision aid caused a positive outcome to be rated less positively and a negative outcome to be rated less negatively. Experiment 2: Agreeing with or heeding the aid was associated with reduced fault, whereas defying the aid was associated with roughly the same fault as not using one at all. Medical students were less harsh than undergraduates but accurately predicted undergraduates' responses. Conclusion. Agreeing with or heeding a decision aid, but not defying it, may reduce liability after an error. However, using an aid may reduce favorability after a positive outcome.

Suggested Citation

  • Mark V. Pezzo & Stephanie P. Pezzo, 2006. "Physician Evaluation after Medical Errors: Does Having a Computer Decision Aid Help or Hurt in Hindsight?," Medical Decision Making, , vol. 26(1), pages 48-56, January.
  • Handle: RePEc:sae:medema:v:26:y:2006:i:1:p:48-56
    DOI: 10.1177/0272989X05282644

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0272989X05282644
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0272989X05282644?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Arkes, Hal R. & Dawes, Robyn M. & Christensen, Caryn, 1986. "Factors influencing the use of a decision rule in a probabilistic task," Organizational Behavior and Human Decision Processes, Elsevier, vol. 37(1), pages 93-110, February.
2. Ashton, R.H., 1990. "Pressure and Performance in Accounting Decision Settings: Paradoxical Effects of Incentives, Feedback, and Justification," Journal of Accounting Research, Wiley Blackwell, vol. 28, pages 148-180.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Nils Köbis & Jean-François Bonnefon & Iyad Rahwan, 2021. "Bad machines corrupt good morals," Nature Human Behaviour, Nature, vol. 5(6), pages 679-685, June.
    2. Chugunova, Marina & Sele, Daniela, 2022. "We and It: An interdisciplinary review of the experimental evidence on how humans interact with machines," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 99(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Christine R. Ohlert & Barbara E. Weißenberger, 2020. "Debiasing escalation of commitment: the effectiveness of decision aids to enhance de-escalation," Journal of Management Control: Zeitschrift für Planung und Unternehmenssteuerung, Springer, vol. 30(4), pages 405-438, February.
    2. Shepherd, Dean A. & Zacharakis, Andrew, 2002. "Venture capitalists' expertise: A call for research into decision aids and cognitive feedback," Journal of Business Venturing, Elsevier, vol. 17(1), pages 1-20, January.
    3. Dobbs, Ian M. & Miller, Anthony D., 2009. "Experimental evidence on financial incentives, information and decision-making," The British Accounting Review, Elsevier, vol. 41(2), pages 71-89.
    4. Dutta, Sujay, 2012. "Vulnerability to Low-Price Signals: An Experimental Study of the Effectiveness of Genuine and Deceptive Signals," Journal of Retailing, Elsevier, vol. 88(1), pages 156-167.
    5. Bonner, Sarah E. & Sprinkle, Geoffrey B., 2002. "The effects of monetary incentives on effort and task performance: theories, evidence, and a framework for research," Accounting, Organizations and Society, Elsevier, vol. 27(4-5), pages 303-345.
    6. Gomaa, Mohamed I. & Hunton, James E. & Vaassen, Eddy H.J. & Carree, Martin A., 2011. "Decision aid reliance: Modeling the effects of decision aid reliability and pressures to perform on reliance behavior," International Journal of Accounting Information Systems, Elsevier, vol. 12(3), pages 206-224.
    7. Boatsman, James R. & Moeckel, Cindy & Pei, Buck K. W., 1997. "The Effects of Decision Consequences on Auditors' Reliance on Decision Aids in Audit Planning," Organizational Behavior and Human Decision Processes, Elsevier, vol. 71(2), pages 211-247, August.
    8. DeZoort, Todd & Harrison, Paul & Taylor, Mark, 2006. "Accountability and auditors' materiality judgments: The effects of differential pressure strength on conservatism, variability, and effort," Accounting, Organizations and Society, Elsevier, vol. 31(4-5), pages 373-390.
    9. Whitecotton, Stacey M. & Sanders, D. Elaine & Norris, Kathleen B., 1998. "Improving Predictive Accuracy with a Combination of Human Intuition and Mechanical Decision Aids," Organizational Behavior and Human Decision Processes, Elsevier, vol. 76(3), pages 325-348, December.
    10. Mahmud, Hasan & Islam, A.K.M. Najmul & Ahmed, Syed Ishtiaque & Smolander, Kari, 2022. "What influences algorithmic decision-making? A systematic literature review on algorithm aversion," Technological Forecasting and Social Change, Elsevier, vol. 175(C).
    11. Markus Jung & Mischa Seiter, 2021. "Towards a better understanding on mitigating algorithm aversion in forecasting: an experimental study," Journal of Management Control: Zeitschrift für Planung und Unternehmenssteuerung, Springer, vol. 32(4), pages 495-516, December.
    12. Lawrence, Michael & Goodwin, Paul & Fildes, Robert, 2002. "Influence of user participation on DSS use and decision accuracy," Omega, Elsevier, vol. 30(5), pages 381-392, October.
    13. Robert M. Gillenkirch & Julia Ortner & Sebastian Robert & Louis Velthuis, 2023. "Designing incentives and performance measurement for advisors: How to make decision-makers listen to advice," Working Papers 2304, Gutenberg School of Management and Economics, Johannes Gutenberg-Universität Mainz.
    14. Glover, Steven M. & Prawitt, Douglas F. & Spilker, Brian C., 1997. "The Influence of Decision Aids on User Behavior: Implications for Knowledge Acquisition and Inappropriate Reliance," Organizational Behavior and Human Decision Processes, Elsevier, vol. 72(2), pages 232-255, November.
    15. Hal R. Arkes & Victoria A. Shaffer & Mitchell A. Medow, 2007. "Patients Derogate Physicians Who Use a Computer-Assisted Diagnostic Aid," Medical Decision Making, , vol. 27(2), pages 189-202, March.
    16. Armstrong, J. Scott & Brodie, Roderick J., 1994. "Effects of portfolio planning methods on decision making: experimental results," MPRA Paper 81684, University Library of Munich, Germany.
    17. Mauldin, Elaine G. & Ruchala, Linda V., 1999. "Towards a meta-theory of accounting information systems," Accounting, Organizations and Society, Elsevier, vol. 24(4), pages 317-331, May.
    18. Glenn Boyle & Gerald Ward, 2018. "Do Better Informed Investors Always Do Better? A Buyback Puzzle," Economic Inquiry, Western Economic Association International, vol. 56(4), pages 2137-2157, October.
    19. Benjamin Enke & Uri Gneezy & Brian Hall & David Martin & Vadim Nelidov & Theo Offerman & Jeroen van de Ven, 2020. "Cognitive Biases: Mistakes or Missing Stakes?," CESifo Working Paper Series 8168, CESifo.
    20. Itzhak Ben-David & John R. Graham & Campbell R. Harvey, 2007. "Managerial Overconfidence and Corporate Policies," NBER Working Papers 13711, National Bureau of Economic Research, Inc.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:medema:v:26:y:2006:i:1:p:48-56. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.