
Estimating Propensity Adjustments for Volunteer Web Surveys

Authors

  • Richard Valliant

    (University of Michigan and University of Maryland, College Park, MD, USA, rvalliant@survey.umd.edu)

  • Jill A. Dever

    (RTI International, Washington, DC, USA)

Abstract

Panels of persons who volunteer to participate in Web surveys are used to make estimates for entire populations, including persons who have no access to the Internet. One method of adjusting a volunteer sample to make it more representative of a larger population involves randomly selecting a reference sample from that larger population. The act of volunteering is treated as a quasi-random process in which each person has some probability of volunteering. One option for computing weights for the volunteers is to combine the reference sample with the Web volunteers and estimate the probability of being a Web volunteer via propensity modeling. There are several options for using the estimated propensities to estimate population quantities, but careful analysis to justify these methods is lacking. The goals of this article are (a) to identify the assumptions and techniques of estimation that will lead to correct inference under the quasi-random approach, (b) to explore whether methods used in practice are biased, and (c) to illustrate the performance of some estimators that use estimated propensities. Two of our main findings are (a) that estimators of means based on propensity models fitted without the weights associated with the reference sample are biased even when the probability of volunteering is correctly modeled, and (b) that, if the probability of volunteering is associated with the analysis variables collected in the volunteer survey, propensity modeling does not correct the bias.
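
The pooled-sample propensity approach summarized above can be sketched in a few lines of code. The snippet below is a minimal illustration on simulated data, not the authors' implementation: it pools a hypothetical weighted reference sample with a volunteer Web sample, fits a weighted logistic model for the probability of being a volunteer (scikit-learn's sample_weight carries the reference sample's design weights), and forms an inverse-propensity estimate of a population mean. All variable names, data, and model settings are illustrative assumptions.

```python
# A minimal sketch of the pooled-sample propensity approach described in the
# abstract; it is illustrative only, not the authors' estimator.  The data,
# variable names, and model settings below are hypothetical assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Reference sample: a probability sample from the full population, carrying
# survey design weights d_ref that sum (roughly) to the population size.
n_ref = 2000
x_ref = rng.normal(size=(n_ref, 2))           # covariates observed in both samples
d_ref = rng.uniform(50, 150, size=n_ref)      # design weights

# Volunteer Web sample: no design weights; the analysis variable y is
# observed only for the volunteers.
n_vol = 1000
x_vol = rng.normal(loc=0.3, size=(n_vol, 2))  # volunteers differ on the covariates
y_vol = 10.0 + 2.0 * x_vol[:, 0] + rng.normal(size=n_vol)

# Propensity model on the combined file: z = 1 for volunteers, 0 for the
# reference sample.  The reference records keep their design weights in the
# fit -- finding (a) in the abstract is that dropping these weights biases
# the resulting estimator even when the volunteering model is correct.
X = np.vstack([x_vol, x_ref])
z = np.concatenate([np.ones(n_vol), np.zeros(n_ref)])
w = np.concatenate([np.ones(n_vol), d_ref])

model = LogisticRegression(C=1e6)             # large C ~ effectively unpenalized fit
model.fit(X, z, sample_weight=w)
p_vol = model.predict_proba(x_vol)[:, 1]      # estimated volunteering propensities

# Inverse-propensity (Hajek-type) estimate of the population mean of y,
# one of several ways the estimated propensities could be used.
pseudo_w = 1.0 / p_vol
y_bar_hat = np.sum(pseudo_w * y_vol) / np.sum(pseudo_w)
print(f"Propensity-adjusted estimate of the mean of y: {y_bar_hat:.2f}")
```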

Suggested Citation

  • Richard Valliant & Jill A. Dever, 2011. "Estimating Propensity Adjustments for Volunteer Web Surveys," Sociological Methods & Research, vol. 40(1), pages 105-137, February.
  • Handle: RePEc:sae:somere:v:40:y:2011:i:1:p:105-137
    DOI: 10.1177/0049124110392533

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0049124110392533
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0049124110392533?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription.

    References listed on IDEAS

    1. Matthias Schonlau & Arthur Van Soest & Arie Kapteyn, 2007. "Are 'Webographic' or Attitudinal Questions Useful for Adjusting Estimates From Web Surveys Using Propensity Scoring?," Working Papers 506, RAND Corporation.
    2. J. B. Copas & H. G. Li, 1997. "Inference for Non‐random Samples," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 59(1), pages 55-95.
    3. Matthias Schonlau & Arthur van Soest & Arie Kapteyn & Mick Couper, 2009. "Selection Bias in Web Surveys and the Use of Propensity Scores," Sociological Methods & Research, vol. 37(3), pages 291-318, February.
    4. Börsch-Supan, Axel & Winter, Joachim, 2004. "How to make internet surveys representative: A case study of a two-step weighting procedure," MEA discussion paper series 04067, Munich Center for the Economics of Aging (MEA) at the Max Planck Institute for Social Law and Social Policy.
    5. Czajka, John L., et al., 1992. "Projecting from Advance Data Using Propensity Modeling: An Application to Income and Tax Statistics," Journal of Business & Economic Statistics, American Statistical Association, vol. 10(2), pages 117-131, April.

    Citations

    Citations are extracted by the CitEc Project; you can subscribe to its RSS feed for this item.


    Cited by:

    1. Maria del Mar Rueda, 2019. "Comments on: Deville and Särndal’s calibration: revisiting a 25 years old successful optimization problem," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer; Sociedad de Estadística e Investigación Operativa, vol. 28(4), pages 1077-1081, December.
    2. Magdalena Smyk & Joanna Tyrowicz & Lucas van der Velde, 2021. "A Cautionary Note on the Reliability of the Online Survey Data: The Case of Wage Indicator," Sociological Methods & Research, vol. 50(1), pages 429-464, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Stéphane Legleye & Géraldine Charrance & Nicolas Razafindratsima & Nathalie Bajos & Aline Bohet & Caroline Moreau, 2018. "The Use of a Nonprobability Internet Panel to Monitor Sexual and Reproductive Health in the General Population," Sociological Methods & Research, vol. 47(2), pages 314-348, March.
    2. Sunghee Lee & Richard Valliant, 2009. "Estimation for Volunteer Panel Web Surveys Using Propensity Score Adjustment and Calibration Adjustment," Sociological Methods & Research, vol. 37(3), pages 319-343, February.
    3. Ramón Ferri-García & María del Mar Rueda, 2022. "Variable selection in Propensity Score Adjustment to mitigate selection bias in online surveys," Statistical Papers, Springer, vol. 63(6), pages 1829-1881, December.
    4. Luis Castro-Martín & Maria del Mar Rueda & Ramón Ferri-García, 2020. "Inference from Non-Probability Surveys with Statistical Matching and Propensity Score Adjustment Using Modern Prediction Techniques," Mathematics, MDPI, vol. 8(6), pages 1-19, June.
    5. Buil-Gil, David & Solymosi, Reka & Moretti, Angelo, 2019. "Non-parametric bootstrap and small area estimation to mitigate bias in crowdsourced data. Simulation study and application to perceived safety," SocArXiv 8hgjt, Center for Open Science.
    6. Amang Sukasih & Donsig Jang & Sonya Vartivarian & Stephen Cohen & Fan Zhang, "undated". "A Simulation Study to Compare Weighting Methods for Survey Nonresponses in the National Survey of Recent College Graduates," Mathematica Policy Research Reports 613f000cac94492f91b53813f, Mathematica Policy Research.
    7. Amarendra Sharma, 2019. "Indira Awas Yojana and Housing Adequacy: An Evaluation using Propensity Score Matching," ASARC Working Papers 2019-05, The Australian National University, Australia South Asia Research Centre.
    8. Crossley, Thomas F. & Fisher, Paul & Low, Hamish, 2021. "The heterogeneous and regressive consequences of COVID-19: Evidence from high quality panel data," Journal of Public Economics, Elsevier, vol. 193(C).
    9. Guzi, Martin & de Pedraza, Pablo, 2013. "A Web Survey Analysis of the Subjective Well-being of Spanish Workers," IZA Discussion Papers 7618, Institute of Labor Economics (IZA).
    10. Hirschauer, Norbert & Grüner, Sven & Mußhoff, Oliver & Becker, Claudia & Jantsch, Antje, 2020. "Can p-values be meaningfully interpreted without random sampling?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 14, pages 71-91.
    11. Maciej Berȩsewicz & Dagmara Nikulin, 2021. "Estimation of the size of informal employment based on administrative records with non‐ignorable selection mechanism," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(3), pages 667-690, June.
    12. Heng Chen & Geoffrey Dunbar & Q. Rallye Shen, 2020. "The Mode is the Message: Using Predata as Exclusion Restrictions to Evaluate Survey Design," Advances in Econometrics, in: Essays in Honor of Cheng Hsiao, volume 41, pages 341-357, Emerald Group Publishing Limited.
    13. Hildebrand, Sean, 2015. "Coerced Confusion? Local Emergency Policy Implementation After September 11," Journal of Homeland Security and Emergency Management, De Gruyter, vol. 12(2), pages 273-298, June.
    14. Grewenig, Elisabeth & Lergetporer, Philipp & Simon, Lisa & Werner, Katharina & Woessmann, Ludger, 2018. "Can Online Surveys Represent the Entire Population?," IZA Discussion Papers 11799, Institute of Labor Economics (IZA).
    15. Knox, Melissa A. & Oddo, Vanessa M. & Walkinshaw, Lina Pinero & Jones-Smith, Jessica, 2020. "Is the public sweet on sugary beverages? Social desirability bias and sweetened beverage taxes," Economics & Human Biology, Elsevier, vol. 38(C).
    16. Arthur van Soest & Arie Kapteyn, 2009. "Mode and Context Effects in Measuring Household Assets," Working Papers 200949, Geary Institute, University College Dublin.
    17. Emmanuel O. Ogundimu, 2022. "Regularization and variable selection in Heckman selection model," Statistical Papers, Springer, vol. 63(2), pages 421-439, April.
    18. Lang, Megan & Ligon, Ethan, 2022. "SMS Surveys of Selected Expenditures," Department of Agricultural & Resource Economics, UC Berkeley, Working Paper Series qt7p7336h5, Department of Agricultural & Resource Economics, UC Berkeley.
    19. Magdalena Smyk & Joanna Tyrowicz & Lucas van der Velde, 2021. "A Cautionary Note on the Reliability of the Online Survey Data: The Case of Wage Indicator," Sociological Methods & Research, vol. 50(1), pages 429-464, February.
    20. Matthias Schonlau & Arthur van Soest & Arie Kapteyn & Mick Couper, 2009. "Selection Bias in Web Surveys and the Use of Propensity Scores," Sociological Methods & Research, vol. 37(3), pages 291-318, February.
