IDEAS home Printed from https://ideas.repec.org/p/war/wpaper/2026-4.html

Behavioral Biases in Stated Preference Valuation of Mortality Risk Reductions: Cost Vector, Anchoring, and Scope Effects

Authors
  • Wojciech Zawadzki

    (Faculty of Economic Sciences, University of Warsaw)

  • Henrik Andersson

    (Swedish National Road and Transport Research Institute (VTI);
    Toulouse School of Economics, University of Toulouse Capitole)

  • Mikołaj Czajkowski

    (Faculty of Economic Sciences, University of Warsaw)

  • Arne Risa Hole

    (Universitat Jaume I)

Abstract

This study investigates how behavioral biases influence stated preference valuation of mortality risk reductions, commonly summarized as the value of a statistical life (VSL). Using a discrete choice experiment (DCE) combined with a double-bounded dichotomous-choice contingent valuation question and an open-ended follow-up, we elicit individuals’ willingness to pay (WTP) for cardiovascular mortality risk reductions. In a randomized design, we varied the cost attribute across three cost range treatments and manipulated information disclosure and feedback to examine three behavioral phenomena: cost vector effects (whether the range of costs presented affects WTP), scope insensitivity (whether WTP scales appropriately with the magnitude of the risk reduction), and anchoring (whether initial cost cues affect subsequent responses). Our results show that mean VSL estimates can vary by up to roughly 25% between cost treatments. Furthermore, WTP responses exhibit partial scope insensitivity: larger risk reductions do not increase WTP proportionally, a deviation from theoretical expectations. Importantly, we find no strong evidence of anchoring: neither revealing all attribute levels upfront, nor starting with extreme cost levels, nor providing feedback on quiz questions significantly affected respondents’ choices or WTP. These findings underscore the need for careful survey design: even if VSL distributions remain statistically similar across cost frames, substantial shifts in mean magnitudes can be consequential for policy. We call for standardized guidelines on cost attribute selection and survey protocols to mitigate bias, ensuring that stated preference methods yield reliable welfare estimates for health policy decisions.
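
The mechanics behind a DCE-based VSL estimate can be sketched as follows. In a standard conditional logit specification, the marginal WTP for a unit risk reduction is the negative ratio of the risk and cost coefficients, and scaling by the risk denominator yields a VSL; a Krinsky-Robb style simulation (as in the Krinsky & Robb and Hole references below) then gives a confidence interval. This is a minimal illustrative sketch only: all coefficients and standard errors are hypothetical numbers, not the paper's estimates, and the draws are assumed independent for simplicity.

```python
import numpy as np

# Hypothetical conditional logit estimates (illustrative, NOT from the paper):
# beta_risk weights a 1-in-100,000 annual mortality risk reduction,
# beta_cost weights annual cost in currency units.
beta_risk, beta_cost = 0.045, -0.0009   # assumed point estimates
se_risk, se_cost = 0.006, 0.0001        # assumed standard errors

# Marginal WTP for a 1-in-100,000 risk reduction is -beta_risk / beta_cost;
# multiplying by 100,000 converts it into a VSL.
wtp_unit = -beta_risk / beta_cost
vsl = wtp_unit * 100_000

# Krinsky-Robb style interval: draw coefficients from their asymptotic
# normal distributions (independence assumed here) and take percentiles
# of the implied VSL draws.
rng = np.random.default_rng(0)
draws_risk = rng.normal(beta_risk, se_risk, 10_000)
draws_cost = rng.normal(beta_cost, se_cost, 10_000)
vsl_draws = -draws_risk / draws_cost * 100_000
ci_low, ci_high = np.percentile(vsl_draws, [2.5, 97.5])

print(f"VSL point estimate: {vsl:,.0f}")
print(f"95% CI: [{ci_low:,.0f}, {ci_high:,.0f}]")
```

Hole (2007), listed in the references below, compares this simulation approach with delta-method and bootstrap alternatives for WTP confidence intervals.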

Suggested Citation

  • Wojciech Zawadzki & Henrik Andersson & Mikołaj Czajkowski & Arne Risa Hole, 2026. "Behavioral Biases in Stated Preference Valuation of Mortality Risk Reductions: Cost Vector, Anchoring, and Scope Effects," Working Papers 2026-4, Faculty of Economic Sciences, University of Warsaw.
  • Handle: RePEc:war:wpaper:2026-4

    Download full text from publisher

    File URL: https://www.wne.uw.edu.pl/download_file/6997/0
    File Function: First version, 2026
    Download Restriction: no

    References listed on IDEAS

    1. Thomas J. Kniesner, 2019. "Behavioral economics and the value of a statistical life," Journal of Risk and Uncertainty, Springer, vol. 58(2), pages 207-217, June.
    2. Onwujekwe, Obinna & Nwagbo, Douglas, 2002. "Investigating starting-point bias: a survey of willingness to pay for insecticide-treated nets," Social Science & Medicine, Elsevier, vol. 55(12), pages 2121-2130, December.
    3. Flachaire, Emmanuel & Hollard, Guillaume, 2007. "Starting point bias and respondent uncertainty in dichotomous choice contingent valuation surveys," Resource and Energy Economics, Elsevier, vol. 29(3), pages 183-194, September.
    4. Emily Lancsar & Jordan Louviere, 2008. "Conducting Discrete Choice Experiments to Inform Healthcare Decision Making," PharmacoEconomics, Springer, vol. 26(8), pages 661-677, August.
    5. Hammitt, James K & Graham, John D, 1999. "Willingness to Pay for Health Protection: Inadequate Sensitivity to Probability?," Journal of Risk and Uncertainty, Springer, vol. 18(1), pages 33-62, April.
    6. Mikołaj Czajkowski & Nick Hanley, 2009. "Using Labels to Investigate Scope Effects in Stated Preference Methods," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 44(4), pages 521-535, December.
    7. Glenk, Klaus & Meyerhoff, Jürgen & Akaichi, Faical & Martin-Ortega, Julia, 2019. "Revisiting cost vector effects in discrete choice experiments," Resource and Energy Economics, Elsevier, vol. 57(C), pages 135-155.
    8. Fredrik Carlsson & Peter Martinsson, 2008. "How Much is Too Much?," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 40(2), pages 165-176, June.
    9. Glenk, Klaus & Meyerhoff, Jürgen & Colombo, Sergio & Faccioli, Michela, 2024. "Enhancing the face validity of choice experiments: A simple diagnostic check," Ecological Economics, Elsevier, vol. 221(C).
    10. Shah, Koonal K. & Tsuchiya, Aki & Wailoo, Allan J., 2015. "Valuing health at the end of life: A stated preference discrete choice experiment," Social Science & Medicine, Elsevier, vol. 124(C), pages 48-56.
    11. Train, Kenneth E., 2009. "Discrete Choice Methods with Simulation," Cambridge Books, Cambridge University Press, number 9780521747387.
    12. Isabell Goldberg & Jutta Roosen, 2007. "Scope insensitivity in health risk reduction studies: A comparison of choice experiments and the contingent valuation method for valuing safer food," Journal of Risk and Uncertainty, Springer, vol. 34(2), pages 123-144, April.
    13. Dan Ariely & George Loewenstein & Drazen Prelec, 2003. ""Coherent Arbitrariness": Stable Demand Curves Without Stable Preferences," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 118(1), pages 73-106.
    14. John List & Craig Gallet, 2001. "What Experimental Protocol Influence Disparities Between Actual and Hypothetical Stated Values?," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 20(3), pages 241-254, November.
    15. Krinsky, Itzhak & Robb, A Leslie, 1991. "Three Methods for Calculating the Statistical Properties of Elasticities: A Comparison," Empirical Economics, Springer, vol. 16(2), pages 199-209.
    16. Arne Risa Hole, 2007. "A comparison of approaches to estimating confidence intervals for willingness to pay measures," Health Economics, John Wiley & Sons, Ltd., vol. 16(8), pages 827-840, August.
    17. Klose, Thomas, 1999. "The contingent valuation method in health care," Health Policy, Elsevier, vol. 47(2), pages 97-123, May.
    18. Meyerhoff, Jürgen & Glenk, Klaus, 2015. "Learning how to choose—effects of instructional choice sets in discrete choice experiments," Resource and Energy Economics, Elsevier, vol. 41(C), pages 122-142.
    19. Hanley, Nick & Adamowicz, Wiktor & Wright, Robert E., 2005. "Price vector effects in choice experiments: an empirical test," Resource and Energy Economics, Elsevier, vol. 27(3), pages 227-234, October.
    20. Alan Diener & Bernie O'Brien & Amiram Gafni, 1998. "Health care contingent valuation studies: a review and classification of the literature," Health Economics, John Wiley & Sons, Ltd., vol. 7(4), pages 313-326, June.
    21. Richard Carson & Theodore Groves, 2007. "Incentive and informational properties of preference questions," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 37(1), pages 181-210, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Andersson, Henrik & Hole, Arne Risa & Svensson, Mikael, 2016. "Valuation of small and multiple health risks: A critical analysis of SP data applied to food and water safety," Journal of Environmental Economics and Management, Elsevier, vol. 75(C), pages 41-53.
    2. Svenningsen, Lea S. & Jacobsen, Jette Bredahl, 2018. "Testing the effect of changes in elicitation format, payment vehicle and bid range on the hypothetical bias for moral goods," Journal of choice modelling, Elsevier, vol. 29(C), pages 17-32.
    3. Lopez-Becerra, E.I. & Alcon, F., 2021. "Social desirability bias in the environmental economic valuation: An inferred valuation approach," Ecological Economics, Elsevier, vol. 184(C).
    4. Sergio Colombo & Wiktor Budziński & Mikołaj Czajkowski & Klaus Glenk, 2022. "The relative performance of ex‐ante and ex‐post measures to mitigate hypothetical and strategic bias in a stated preference study," Journal of Agricultural Economics, Wiley Blackwell, vol. 73(3), pages 845-873, September.
    5. Ahtiainen, Heini & Pouta, Eija & Zawadzki, Wojciech & Tienhaara, Annika, 2023. "Cost vector effects in discrete choice experiments with positive status quo cost," Journal of choice modelling, Elsevier, vol. 47(C).
    6. Martinet, Vincent & David, Maïa & Mermet-Bijon, Vincent & Crastes Dit Sourd, Romain, 2025. "Cost vector effects in forced-choice discrete choice experiments: Assessing the acceptability of future glyphosate policies," Journal of choice modelling, Elsevier, vol. 55(C).
    7. Glenk, Klaus & Meyerhoff, Jürgen & Akaichi, Faical & Martin-Ortega, Julia, 2019. "Revisiting cost vector effects in discrete choice experiments," Resource and Energy Economics, Elsevier, vol. 57(C), pages 135-155.
    8. Robert J. Johnston & Kevin J. Boyle & Wiktor (Vic) Adamowicz & Jeff Bennett & Roy Brouwer & Trudy Ann Cameron & W. Michael Hanemann & Nick Hanley & Mandy Ryan & Riccardo Scarpa & Roger Tourangeau & Ch, 2017. "Contemporary Guidance for Stated Preference Studies," Journal of the Association of Environmental and Resource Economists, University of Chicago Press, vol. 4(2), pages 319-405.
    9. Carlsson, Fredrik & Raun Mørkbak, Morten & Bøye Olsen, Søren, 2010. "The first time is the hardest: A test of ordering effects in choice experiments," Working Papers in Economics 470, University of Gothenburg, Department of Economics.
    10. Hoyos, David, 2010. "The state of the art of environmental valuation with discrete choice experiments," Ecological Economics, Elsevier, vol. 69(8), pages 1595-1603, June.
    11. Hammitt, James K. & Herrera-Araujo, Daniel, 2018. "Peeling back the onion: Using latent class analysis to uncover heterogeneous responses to stated preference surveys," Journal of Environmental Economics and Management, Elsevier, vol. 87(C), pages 165-189.
    12. Marit Kragt, 2013. "The Effects of Changing Cost Vectors on Choices and Scale Heterogeneity," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 54(2), pages 201-221, February.
    13. Ladenburg, Jacob & Olsen, Søren Bøye, 2008. "Gender-specific starting point bias in choice experiments: Evidence from an empirical study," Journal of Environmental Economics and Management, Elsevier, vol. 56(3), pages 275-285, November.
    14. Meyerhoff, Jürgen & Glenk, Klaus, 2015. "Learning how to choose—effects of instructional choice sets in discrete choice experiments," Resource and Energy Economics, Elsevier, vol. 41(C), pages 122-142.
    15. Sergio Colombo & Wiktor Budziński & Mikołaj Czajkowski & Klaus Glenk, 2020. "Ex-ante and ex-post measures to mitigate hypothetical bias. Are they alternative or complementary tools to increase the reliability and validity of DCE estimates?," Working Papers 2020-20, Faculty of Economic Sciences, University of Warsaw.
    16. Tomasz Gajderowicz & Gabriela Grotkowska, 2019. "Polarization of Tastes: Stated Preference Stability in Sequential Discrete Choices," European Research Studies Journal, European Research Studies Journal, vol. 0(4), pages 70-87.
    17. Hoyos Ramos, David, 2010. "Using discrete choice experiments for environmental valuation," BILTOKI 1134-8984, Universidad del País Vasco - Departamento de Economía Aplicada III (Econometría y Estadística).
    18. Nicolas Krucien & Amiram Gafni & Nathalie Pelletier‐Fleury, 2015. "Empirical Testing of the External Validity of a Discrete Choice Experiment to Determine Preferred Treatment Option: The Case of Sleep Apnea," Health Economics, John Wiley & Sons, Ltd., vol. 24(8), pages 951-965, August.
    19. Roy Brouwer & Solomon Tarfasa, 2020. "Testing hypothetical bias in a framed field experiment," Canadian Journal of Agricultural Economics/Revue canadienne d'agroeconomie, Canadian Agricultural Economics Society/Societe canadienne d'agroeconomie, vol. 68(3), pages 343-357, September.

    More about this item

    JEL classification:

    • I12 - Health, Education, and Welfare - - Health - - - Health Behavior
    • D01 - Microeconomics - - General - - - Microeconomic Behavior: Underlying Principles
    • D61 - Microeconomics - - Welfare Economics - - - Allocative Efficiency; Cost-Benefit Analysis
    • Q51 - Agricultural and Natural Resource Economics; Environmental and Ecological Economics - - Environmental Economics - - - Valuation of Environmental Effects
    • C83 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Survey Methods; Sampling Methods
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

