
Comparing water quality valuation across probability and non‐probability samples

Author

Listed:
  • Kaitlynn Sandstrom‐Mistry
  • Frank Lupi
  • Hyunjung Kim
  • Joseph A. Herriges

Abstract

We compare water quality valuation results from a probability sample and two opt-in non-probability samples, MTurk and Qualtrics. The samples differ in some key demographics, but measured attitudes are strikingly similar. In the valuation models, most parameters differed significantly across samples, yet many of the marginal willingness-to-pay estimates were similar. Notably, for non-marginal changes there were some differences across samples: MTurk values were always significantly greater than those from the probability sample, as were Qualtrics values for changes up to about a 20% improvement. Overall, the evidence is mixed, with some key differences but many similarities across samples.
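The listing does not spell out the underlying valuation model, but the pattern described above (parameter estimates that differ across samples while marginal willingness-to-pay estimates remain similar) is easiest to see in a standard linear-in-parameters random utility setup, where marginal WTP is a ratio of coefficients. The sketch below is an illustration under that assumption, not the authors' specification; q denotes the water quality attribute and c the cost attribute.

    U_{ij} = \beta_q \, q_{ij} - \beta_c \, c_{ij} + \varepsilon_{ij},
    \qquad
    \mathrm{MWTP}_q = \frac{\beta_q}{\beta_c}

Because MWTP_q is a ratio, two samples whose \beta_q and \beta_c shift proportionally (for example, through different error variances) can yield significantly different parameter estimates but similar marginal WTP, whereas the value of a non-marginal quality improvement depends on the full specification and can diverge across samples, as reported here for MTurk and Qualtrics.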

Suggested Citation

  • Kaitlynn Sandstrom‐Mistry & Frank Lupi & Hyunjung Kim & Joseph A. Herriges, 2023. "Comparing water quality valuation across probability and non‐probability samples," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 45(2), pages 744-761, June.
  • Handle: RePEc:wly:apecpp:v:45:y:2023:i:2:p:744-761
    DOI: 10.1002/aepp.13375

    Download full text from publisher

    File URL: https://doi.org/10.1002/aepp.13375
    Download Restriction: no

    File URL: https://libkey.io/10.1002/aepp.13375?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.


    References listed on IDEAS

    1. Zhifeng Gao & Lisa A. House & Jing Xie, 2016. "Online Survey Data Quality and Its Implication for Willingness-to-Pay: A Cross-Country Comparison," Canadian Journal of Agricultural Economics/Revue canadienne d'agroeconomie, Canadian Agricultural Economics Society/Societe canadienne d'agroeconomie, vol. 64(2), pages 199-221, June.
    2. Lindhjem, Henrik & Navrud, Ståle, 2011. "Using Internet in Stated Preference Surveys: A Review and Comparison of Survey Modes," International Review of Environmental and Resource Economics, now publishers, vol. 5(4), pages 309-351, September.
    3. John Gibson & David Johnson, 2019. "Are Online Samples Credible? Evidence from Risk Elicitation Tests," Atlantic Economic Journal, Springer;International Atlantic Economic Society, vol. 47(3), pages 377-379, September.
    4. Roulin, Nicolas, 2015. "Don't Throw the Baby Out With the Bathwater: Comparing Data Quality of Crowdsourcing, Online Panels, and Student Samples," Industrial and Organizational Psychology, Cambridge University Press, vol. 8(2), pages 190-196, June.
    5. David Johnson & John Barry Ryan, 2020. "Amazon Mechanical Turk workers can provide consistent and economically meaningful data," Southern Economic Journal, John Wiley & Sons, vol. 87(1), pages 369-385, July.
    6. Paolacci, Gabriele & Chandler, Jesse & Ipeirotis, Panagiotis G., 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Cambridge University Press, vol. 5(5), pages 411-419 (RePEc handle repec:cup:judgdm:v:5:y:2010:i:5:p:411-419, not listed on IDEAS).
    7. Liebe, Ulf & Glenk, Klaus & Oehlmann, Malte & Meyerhoff, Jürgen, 2015. "Does the use of mobile devices (tablets and smartphones) affect survey quality and choice behaviour in web surveys?," Journal of choice modelling, Elsevier, vol. 14(C), pages 17-31.
    8. Boas, Taylor C. & Christenson, Dino P. & Glick, David M., 2020. "Recruiting large online samples in the United States and India: Facebook, Mechanical Turk, and Qualtrics," Political Science Research and Methods, Cambridge University Press, vol. 8(2), pages 232-250, April.
    9. Sandorf, Erlend Dancke & Persson, Lars & Broberg, Thomas, 2020. "Using an integrated choice and latent variable model to understand the impact of “professional” respondents in a stated preference survey," Resource and Energy Economics, Elsevier, vol. 61(C).
    10. Søren Olsen, 2009. "Choosing Between Internet and Mail Survey Modes for Choice Experiment Surveys Considering Non-Market Goods," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 44(4), pages 591-610, December.
    11. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Erlend Dancke Sandorf & Kristine Grimsrud & Henrik Lindhjem, 2022. "Ponderous, Proficient or Professional? Survey Experience and Smartphone Effects in Stated Preference Research," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 81(4), pages 807-832, April.
    2. Skeie, Magnus Aa. & Lindhjem, Henrik & Skjeflo, Sofie & Navrud, Ståle, 2019. "Smartphone and tablet effects in contingent valuation web surveys – No reason to worry?," Ecological Economics, Elsevier, vol. 165(C), pages 1-1.
    3. Menegaki, Angeliki N. & Olsen, Søren Bøye & Tsagarakis, Konstantinos P., 2016. "Towards a common standard – A reporting checklist for web-based stated preference valuation surveys and a critique for mode surveys," Journal of choice modelling, Elsevier, vol. 18(C), pages 18-50.
    4. Kelvin Balcombe & Michail Bitzios & Iain Fraser & Janet Haddock-Fraser, 2014. "Using Attribute Importance Rankings Within Discrete Choice Experiments: An Application to Valuing Bread Attributes," Journal of Agricultural Economics, Wiley Blackwell, vol. 65(2), pages 446-462, June.
    5. Chen, Xuqi & Shen, Meng & Gao, Zhifeng, 2017. "Impact of Intra-respondent Variations in Attribute Attendance on Consumer Preference in Food Choice," 2017 Annual Meeting, July 30-August 1, Chicago, Illinois 258509, Agricultural and Applied Economics Association.
    6. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," IZA Discussion Papers 15478, Institute of Labor Economics (IZA).
    7. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    8. Barton, Jared & Pan, Xiaofei, 2022. "Movin’ on up? A survey experiment on mobility enhancing policies," European Journal of Political Economy, Elsevier, vol. 74(C).
    9. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    10. John Gibson & David Johnson, 2021. "Breaking Bad: When Being Disadvantaged Incentivizes (Seemingly) Risky Behavior," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 47(1), pages 107-134, January.
    11. Carlsson, Fredrik & Kataria, Mitesh & Lampi, Elina & Martinsson, Peter, 2021. "Past and present outage costs – A follow-up study of households’ willingness to pay to avoid power outages," Resource and Energy Economics, Elsevier, vol. 64(C).
    12. Subroy, Vandana & Gunawardena, Asha & Polyakov, Maksym & Pandit, Ram & Pannell, David J., 2019. "The worth of wildlife: A meta-analysis of global non-market values of threatened species," Ecological Economics, Elsevier, vol. 164(C).
    13. Johannes G. Jaspersen & Marc A. Ragin & Justin R. Sydnor, 2022. "Insurance demand experiments: Comparing crowdworking to the lab," Journal of Risk & Insurance, The American Risk and Insurance Association, vol. 89(4), pages 1077-1107, December.
    14. David Johnson & John Barry Ryan, 2020. "Amazon Mechanical Turk workers can provide consistent and economically meaningful data," Southern Economic Journal, John Wiley & Sons, vol. 87(1), pages 369-385, July.
    15. Luke Fowler & Stephen Utych, 2021. "Are people better employees than machines? Dehumanizing language and employee performance appraisals," Social Science Quarterly, Southwestern Social Science Association, vol. 102(4), pages 2006-2019, July.
    16. Penn, Jerrod & Hu, Wuyang, 2016. "Making the Most of Cheap Talk in an Online Survey," 2016 Annual Meeting, July 31-August 2, Boston, Massachusetts 236171, Agricultural and Applied Economics Association.
    17. Sandorf, Erlend Dancke & Persson, Lars & Broberg, Thomas, 2020. "Using an integrated choice and latent variable model to understand the impact of “professional” respondents in a stated preference survey," Resource and Energy Economics, Elsevier, vol. 61(C).
    18. James W. Mjelde & Tae-Kyun Kim & Choong-Ki Lee, 2016. "Comparison of Internet and interview survey modes when estimating willingness to pay using choice experiments," Applied Economics Letters, Taylor & Francis Journals, vol. 23(1), pages 74-77, January.
    19. Jed J. Cohen & Johannes Reichl, 2022. "Comparing Internet and phone survey mode effects across countries and research contexts," Australian Journal of Agricultural and Resource Economics, Australian Agricultural and Resource Economics Society, vol. 66(1), pages 44-71, January.
