Printed from https://ideas.repec.org/p/ags/aaea21/312913.html

Comparing Water Quality Valuation Across Probability and Non-Probability Samples

Author

Listed:
  • Sandstrom, Kaitlynn M.A.
  • Lupi, Frank

Abstract

We compare water quality valuation results from a probability sample and two opt-in non-probability samples, MTurk and Qualtrics. The samples differ in some key demographics, but measured attitudes are strikingly similar. In the valuation models, most parameters differed significantly across samples, yet many of the marginal willingness-to-pay estimates were similar. Notably, for non-marginal changes there were some differences by sample: MTurk values were always significantly greater than those from the probability sample, as were Qualtrics values for changes up to about a 20% improvement. Overall, the evidence is mixed, with some key differences but many similarities across samples.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • Sandstrom, Kaitlynn M.A. & Lupi, Frank, 2021. "Comparing Water Quality Valuation Across Probability and Non-Probability Samples," 2021 Annual Meeting, August 1-3, Austin, Texas 312913, Agricultural and Applied Economics Association.
  • Handle: RePEc:ags:aaea21:312913
    DOI: 10.22004/ag.econ.312913

    Download full text from publisher

    File URL: https://ageconsearch.umn.edu/record/312913/files/Sandstrom%20et%20al_AAEA_2021_revised-7-30-21.pdf
    Download Restriction: no

    File URL: https://libkey.io/10.22004/ag.econ.312913?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    References listed on IDEAS

    1. Zhifeng Gao & Lisa A. House & Jing Xie, 2016. "Online Survey Data Quality and Its Implication for Willingness-to-Pay: A Cross-Country Comparison," Canadian Journal of Agricultural Economics/Revue canadienne d'agroeconomie, Canadian Agricultural Economics Society/Societe canadienne d'agroeconomie, vol. 64(2), pages 199-221, June.
    2. Lindhjem, Henrik & Navrud, Ståle, 2011. "Using Internet in Stated Preference Surveys: A Review and Comparison of Survey Modes," International Review of Environmental and Resource Economics, now publishers, vol. 5(4), pages 309-351, September.
    3. Boas, Taylor C. & Christenson, Dino P. & Glick, David M., 2020. "Recruiting large online samples in the United States and India: Facebook, Mechanical Turk, and Qualtrics," Political Science Research and Methods, Cambridge University Press, vol. 8(2), pages 232-250, April.
    4. David Johnson & John Barry Ryan, 2020. "Amazon Mechanical Turk workers can provide consistent and economically meaningful data," Southern Economic Journal, John Wiley & Sons, vol. 87(1), pages 369-385, July.
    5. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    6. repec:cup:judgdm:v:5:y:2010:i:5:p:411-419 is not listed on IDEAS
    7. John Gibson & David Johnson, 2019. "Are Online Samples Credible? Evidence from Risk Elicitation Tests," Atlantic Economic Journal, Springer;International Atlantic Economic Society, vol. 47(3), pages 377-379, September.
    8. Roulin, Nicolas, 2015. "Don't Throw the Baby Out With the Bathwater: Comparing Data Quality of Crowdsourcing, Online Panels, and Student Samples," Industrial and Organizational Psychology, Cambridge University Press, vol. 8(2), pages 190-196, June.
    9. Søren Olsen, 2009. "Choosing Between Internet and Mail Survey Modes for Choice Experiment Surveys Considering Non-Market Goods," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 44(4), pages 591-610, December.
    10. Liebe, Ulf & Glenk, Klaus & Oehlmann, Malte & Meyerhoff, Jürgen, 2015. "Does the use of mobile devices (tablets and smartphones) affect survey quality and choice behaviour in web surveys?," Journal of choice modelling, Elsevier, vol. 14(C), pages 17-31.
    11. Sandorf, Erlend Dancke & Persson, Lars & Broberg, Thomas, 2020. "Using an integrated choice and latent variable model to understand the impact of “professional” respondents in a stated preference survey," Resource and Energy Economics, Elsevier, vol. 61(C).

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Plaga, Leonie Sara & Lynch, Muireann & Curtis, John & Bertsch, Valentin, 2024. "How public acceptance affects power system development—A cross-country analysis for wind power," Applied Energy, Elsevier, vol. 359(C).
    2. Curtis, John & Grilli, Gianluca & Lynch, Muireann Á, 2024. "Residential renovations: understanding cost-disruption trade-offs," Papers WP776, Economic and Social Research Institute (ESRI).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Skeie, Magnus Aa. & Lindhjem, Henrik & Skjeflo, Sofie & Navrud, Ståle, 2019. "Smartphone and tablet effects in contingent valuation web surveys – No reason to worry?," Ecological Economics, Elsevier, vol. 165(C), pages 1-1.
2. Menegaki, Angeliki N. & Olsen, Søren Bøye & Tsagarakis, Konstantinos P., 2016. "Towards a common standard – A reporting checklist for web-based stated preference valuation surveys and a critique for mode surveys," Journal of choice modelling, Elsevier, vol. 18(C), pages 18-50.
    3. Erlend Dancke Sandorf & Kristine Grimsrud & Henrik Lindhjem, 2022. "Ponderous, Proficient or Professional? Survey Experience and Smartphone Effects in Stated Preference Research," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 81(4), pages 807-832, April.
    4. Kelvin Balcombe & Michail Bitzios & Iain Fraser & Janet Haddock-Fraser, 2014. "Using Attribute Importance Rankings Within Discrete Choice Experiments: An Application to Valuing Bread Attributes," Journal of Agricultural Economics, Wiley Blackwell, vol. 65(2), pages 446-462, June.
    5. Penn, Jerrod & Hu, Wuyang, 2016. "Making the Most of Cheap Talk in an Online Survey," 2016 Annual Meeting, July 31-August 2, Boston, Massachusetts 236171, Agricultural and Applied Economics Association.
    6. Chen, Xuqi & Shen, Meng & Gao, Zhifeng, 2017. "Impact of Intra-respondent Variations in Attribute Attendance on Consumer Preference in Food Choice," 2017 Annual Meeting, July 30-August 1, Chicago, Illinois 258509, Agricultural and Applied Economics Association.
7. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    8. Sandorf, Erlend Dancke & Persson, Lars & Broberg, Thomas, 2020. "Using an integrated choice and latent variable model to understand the impact of “professional” respondents in a stated preference survey," Resource and Energy Economics, Elsevier, vol. 61(C).
    9. Gordon Pennycook & David G. Rand, 2022. "Accuracy prompts are a replicable and generalizable approach for reducing the spread of misinformation," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    10. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    11. Subroy, Vandana & Gunawardena, Asha & Polyakov, Maksym & Pandit, Ram & Pannell, David J., 2019. "The worth of wildlife: A meta-analysis of global non-market values of threatened species," Ecological Economics, Elsevier, vol. 164(C), pages 1-1.
    12. Kolstoe, Sonja & Naald, Brian Vander & Cohan, Alison, 2022. "A tale of two samples: Understanding WTP differences in the age of social media," Ecosystem Services, Elsevier, vol. 55(C).
    13. John Gibson & David Johnson, 0. "Breaking Bad: When Being Disadvantaged Incentivizes (Seemingly) Risky Behavior," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 0, pages 1-28.
    14. Barton, Jared & Pan, Xiaofei, 2022. "Movin’ on up? A survey experiment on mobility enhancing policies," European Journal of Political Economy, Elsevier, vol. 74(C).
    15. Carlsson, Fredrik & Kataria, Mitesh & Lampi, Elina & Martinsson, Peter, 2021. "Past and present outage costs – A follow-up study of households’ willingness to pay to avoid power outages," Resource and Energy Economics, Elsevier, vol. 64(C).
    16. Mjelde & Tae-Kyun Kim & Choong-Ki Lee, 2016. "Comparison of Internet and interview survey modes when estimating willingness to pay using choice experiments," Applied Economics Letters, Taylor & Francis Journals, vol. 23(1), pages 74-77, January.
    17. Jed J. Cohen & Johannes Reichl, 2022. "Comparing Internet and phone survey mode effects across countries and research contexts," Australian Journal of Agricultural and Resource Economics, Australian Agricultural and Resource Economics Society, vol. 66(1), pages 44-71, January.
    18. Johannes G. Jaspersen & Marc A. Ragin & Justin R. Sydnor, 2022. "Insurance demand experiments: Comparing crowdworking to the lab," Journal of Risk & Insurance, The American Risk and Insurance Association, vol. 89(4), pages 1077-1107, December.
    19. Liebe, Ulf & Glenk, Klaus & von Meyer-Höfer, Marie & Spiller, Achim, 2019. "A web survey application of real choice experiments," Journal of choice modelling, Elsevier, vol. 33(C).
    20. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).

    More about this item

    Keywords

Environmental Economics and Policy; Resource/Energy Economics and Policy; Research Methods/Statistical Methods.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ags:aaea21:312913. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: AgEcon Search (email available below). General contact details of provider: https://edirc.repec.org/data/aaeaaea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.