
Measurement Error Calibration in Mixed-mode Sample Surveys

Author

Listed:
  • Bart Buelens
  • Jan A. van den Brakel

Abstract

Mixed-mode surveys are known to be susceptible to mode-dependent selection and measurement effects, collectively referred to as mode effects. Using different data collection modes within the same survey may reduce the selectivity of the overall response, but the measurement errors differ across modes. Inference in sample surveys generally proceeds by correcting for selectivity (for example, by applying calibration estimators) while ignoring measurement error. When a survey is conducted repeatedly, such inferences are valid only if the measurement error remains constant between editions. In sequential mixed-mode surveys, the mode composition of the overall response is likely to differ between subsequent editions of the survey, leading to variations in the total measurement error and invalidating classical inferences. An approach to inference in these circumstances is proposed, based on calibrating the mode composition of the respondents toward fixed levels. Assumptions and risks are discussed and explored in a simulation study, and the approach is applied to the Dutch crime victimization survey.
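
The calibration idea described in the abstract can be sketched in code. The following Python fragment is not the authors' estimator; it is a minimal, hypothetical illustration that post-stratifies design weights so that the weighted mode composition of the respondents is held at fixed target shares, which is the mechanism the paper builds on. The variable names, target shares, and simulated mode-specific bias are all assumptions made for illustration.

    # Illustrative sketch (not the authors' code): hold the mode composition of a
    # mixed-mode survey fixed by post-stratifying design weights to target mode shares.
    import numpy as np
    import pandas as pd

    def calibrate_mode_composition(df, target_shares, weight_col="design_weight", mode_col="mode"):
        """Rescale weights so the weighted share of each mode equals a fixed target.

        target_shares maps mode label -> desired share (shares should sum to 1).
        Returns a copy of df with a new column 'cal_weight'.
        """
        df = df.copy()
        total = df[weight_col].sum()
        current = df.groupby(mode_col)[weight_col].sum() / total  # observed mode shares
        factors = {m: target_shares[m] / current[m] for m in target_shares}  # one factor per mode
        df["cal_weight"] = df[weight_col] * df[mode_col].map(factors)
        return df

    # Toy data: two modes, with a hypothetical measurement bias attached to the web mode.
    rng = np.random.default_rng(1)
    n = 1000
    mode = rng.choice(["web", "f2f"], size=n, p=[0.7, 0.3])
    y = rng.normal(50, 10, size=n) + np.where(mode == "web", 2.0, 0.0)
    df = pd.DataFrame({"mode": mode, "y": y, "design_weight": np.ones(n)})

    # Calibrate the mode composition toward fixed levels, e.g. 60% web / 40% face-to-face.
    cal = calibrate_mode_composition(df, {"web": 0.6, "f2f": 0.4})
    print("weighted mean before:", np.average(df["y"], weights=df["design_weight"]).round(2))
    print("weighted mean after: ", np.average(cal["y"], weights=cal["cal_weight"]).round(2))

Because the mode composition is held at the same levels in every edition, the mode-dependent measurement bias entering the estimates stays constant over time, which is what keeps estimates comparable between editions under the assumptions discussed in the paper.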

Suggested Citation

  • Bart Buelens & Jan A. van den Brakel, 2015. "Measurement Error Calibration in Mixed-mode Sample Surveys," Sociological Methods & Research, , vol. 44(3), pages 391-426, August.
  • Handle: RePEc:sae:somere:v:44:y:2015:i:3:p:391-426
    DOI: 10.1177/0049124114532444

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0049124114532444
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0049124114532444?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dana Garbarski & Nora Cate Schaeffer & Jennifer Dykema, 2019. "The Effects of Features of Survey Measurement on Self-Rated Health: Response Option Order and Scale Orientation," Applied Research in Quality of Life, Springer; International Society for Quality-of-Life Studies, vol. 14(2), pages 545-560, April.
    2. Brauner, Jacob, 2020. "Are Smileys Valid Answers? Survey Data Quality with Innovative Item Formats," SocArXiv dk9bc, Center for Open Science.
    3. Anna DeCastellarnau, 2018. "A classification of response scale characteristics that affect data quality: a literature review," Quality & Quantity: International Journal of Methodology, Springer, vol. 52(4), pages 1523-1559, July.
    4. Johannes W. S. Kappelhof & Edith D. De Leeuw, 2019. "Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response among Non-Western Minorities," Sociological Methods & Research, vol. 48(1), pages 116-155, February.
    5. Zawojska, Ewa & Czajkowski, Mikołaj, "undated". "Are preferences stated in web vs. personal interviews different? A comparison of willingness to pay results for a large multi-country study of the Baltic Sea eutrophication reduction," Annual Meeting, 2017, June 18-21, Montreal, Canada 258604, Canadian Agricultural Economics Society.
    6. Lindhjem, Henrik & Navrud, Ståle, 2011. "Using Internet in Stated Preference Surveys: A Review and Comparison of Survey Modes," International Review of Environmental and Resource Economics, now publishers, vol. 5(4), pages 309-351, September.
    7. Chatpong Tangmanee & Phattharaphong Niruttinanon, 2019. "Web Survey’s Completion Rates: Effects of Forced Responses, Question Display Styles, and Subjects’ Attitude," International Journal of Research in Business and Social Science (2147-4478), Center for the Strategic Studies in Business and Finance, vol. 8(1), pages 20-29, January.
    8. Jäckle, Annette & Roberts, Caroline, 2012. "Causes of mode effects: separating out interviewer and stimulus effects in comparisons of face-to-face and telephone surveys," ISER Working Paper Series 2012-27, Institute for Social and Economic Research.
    9. Wiśniowski, Arkadiusz & Bijak, Jakub & Forster, Jonathan J. & Smith, Peter W.F., 2019. "Hierarchical model for forecasting the outcomes of binary referenda," Computational Statistics & Data Analysis, Elsevier, vol. 133(C), pages 90-103.
    10. Natalja Menold & Vera Toepoel, 2024. "Do Different Devices Perform Equally Well with Different Numbers of Scale Points and Response Formats? A test of measurement invariance and reliability," Sociological Methods & Research, vol. 53(2), pages 898-939, May.
    11. Grewenig, Elisabeth & Lergetporer, Philipp & Simon, Lisa & Werner, Katharina & Woessmann, Ludger, 2018. "Can Online Surveys Represent the Entire Population?," IZA Discussion Papers 11799, Institute of Labor Economics (IZA).
    12. Debra Wright & Matt Sloan & Kirsten Barrett, 2012. "Is There a Trade-off Between Quality and Cost? Telephone Versus Face-to-Face Interviewing of Persons with Disabilities," Mathematica Policy Research Reports cb6067df035641e99a913d534, Mathematica Policy Research.
    13. Skeie, Magnus Aa. & Lindhjem, Henrik & Skjeflo, Sofie & Navrud, Ståle, 2019. "Smartphone and tablet effects in contingent valuation web surveys – No reason to worry?," Ecological Economics, Elsevier, vol. 165(C), pages 1-1.
    14. de Bruijne, M.A., 2015. "Designing web surveys for the multi-device internet," Other publications TiSEM 19e4d446-a62b-4a95-8691-8, Tilburg University, School of Economics and Management.
    15. Mamine, Fateh & Fares, M'hand & Minviel, Jean Joseph, 2020. "Contract Design for Adoption of Agrienvironmental Practices: A Meta-analysis of Discrete Choice Experiments," Ecological Economics, Elsevier, vol. 176(C).
    16. Jorre T. A. Vannieuwenhuyze & Geert Loosveldt, 2013. "Evaluating Relative Mode Effects in Mixed-Mode Surveys," Sociological Methods & Research, vol. 42(1), pages 82-104, February.
    17. Tobias Gummer & Tanja Kunz, 2022. "Relying on External Information Sources When Answering Knowledge Questions in Web Surveys," Sociological Methods & Research, vol. 51(2), pages 816-836, May.
    18. Cornelia E. Neuert & Joss Roßmann & Henning Silber, 2023. "Using Eye-Tracking Methodology to Study Grid Question Designs in Web Surveys," Journal of Official Statistics, Sciendo, vol. 39(1), pages 79-101, March.
    19. Rolf Becker, 2022. "The effects of a special sequential mixed-mode design, and reminders, on panellists’ participation in a probability-based panel study," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(1), pages 259-284, February.
    20. Toepoel, V. & Das, J.W.M. & van Soest, A.H.O., 2008. "Design Effects in Web Surveys : Comparing Trained and Fresh Respondents," Discussion Paper 2008-51, Tilburg University, Center for Economic Research.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:somere:v:44:y:2015:i:3:p:391-426. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.