Printed from https://ideas.repec.org/a/sae/somere/v48y2019i1p116-155.html

Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response among Non-Western Minorities

Authors

  • Johannes W. S. Kappelhof
  • Edith D. De Leeuw

Abstract

This study investigates the impact of different modes and tailor-made response enhancing measures (TMREM; e.g., bilingual interviewers with a shared ethnic background and translated questionnaires) on the measurement of substantive variables in surveys among minority ethnic groups in the Netherlands. It also provides insight into how well a recently developed method for disentangling mode measurement and mode selection effects can detect mode measurement effects, as well as into the tenability of the assumptions underlying this method. The data come from a large-scale survey design experiment among the four largest non-Western minority ethnic groups in the Netherlands, comparing single-mode computer-assisted personal interviewing (CAPI) with a sequential mixed-mode design consisting of computer-assisted web interviewing, computer-assisted telephone interviewing, and CAPI (CAPI-MM). The number and intensity of the TMREM varied among the four ethnic groups. The results show that mode measurement effects occur among all ethnic groups and result from a combination of the presence or absence of an interviewer and the TMREM. Mode measurement effects occur more often on sociocultural questions, but occasionally also on more sociostructural or background questions. The method for disentangling mode measurement and mode selection effects can be applied to detect mode measurement effects, but one should be cautious in interpreting them: implausible mode measurement effects can be caused by violations of the assumptions underlying the method.
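
For intuition about what "disentangling mode measurement and mode selection effects" involves, the sketch below shows one possible regression-based counterfactual decomposition under strong ignorability assumptions. It is an illustrative sketch, not the estimator used in the article (which builds on the approach of Vannieuwenhuyze and colleagues cited in the references below); the function name decompose_mode_effect, the covariate-adjustment strategy, and the assumption that the single-mode CAPI benchmark identifies mode-free measurement are all assumptions introduced here for illustration.

    # Illustrative sketch (not the authors' estimator): split the observed
    # difference between a single-mode CAPI benchmark and a mixed-mode sample
    # into a mode selection part and a mode measurement part.
    from sklearn.linear_model import LinearRegression

    def decompose_mode_effect(capi, mixed, y, covariates):
        """capi       -- DataFrame of single-mode CAPI benchmark respondents
        mixed      -- DataFrame of sequential mixed-mode respondents
        y          -- column name of the substantive variable
        covariates -- back-door covariates assumed to capture mode selection
        """
        # Outcome model fitted where measurement is known to be CAPI.
        model = LinearRegression().fit(capi[covariates], capi[y])

        # Counterfactual mean: what the mixed-mode respondents would have
        # answered had they all been interviewed in CAPI (ignorability assumption).
        y_mm_if_capi = model.predict(mixed[covariates]).mean()

        total = mixed[y].mean() - capi[y].mean()
        selection = y_mm_if_capi - capi[y].mean()      # different respondents
        measurement = mixed[y].mean() - y_mm_if_capi   # different mode / TMREM
        return {"total": total, "selection": selection, "measurement": measurement}

Because total = selection + measurement by construction in this sketch, an implausible measurement component typically signals that the ignorability or comparability assumptions behind the counterfactual are violated, which is the caution the abstract raises about interpreting detected mode measurement effects.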

Suggested Citation

  • Johannes W. S. Kappelhof & Edith D. De Leeuw, 2019. "Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response among Non-Western Minorities," Sociological Methods & Research, vol. 48(1), pages 116-155, February.
  • Handle: RePEc:sae:somere:v:48:y:2019:i:1:p:116-155
    DOI: 10.1177/0049124117701474

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0049124117701474
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0049124117701474?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    References listed on IDEAS

    1. Annette Jäckle & Caroline Roberts & Peter Lynn, 2010. "Assessing the Effect of Data Collection Mode on Measurement," International Statistical Review, International Statistical Institute, vol. 78(1), pages 3-20, April.
    2. Jorre T. A. Vannieuwenhuyze & Geert Loosveldt & Geert Molenberghs, 2012. "A Method to Evaluate Mode Effects on the Mean and Variance of a Continuous Variable in Mixed-Mode Surveys," International Statistical Review, International Statistical Institute, vol. 80(2), pages 306-322, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zawojska, Ewa & Czajkowski, Mikołaj, 2017. "Are preferences stated in web vs. personal interviews different? A comparison of willingness to pay results for a large multi-country study of the Baltic Sea eutrophication reduction," Annual Meeting, 2017, June 18-21, Montreal, Canada 258604, Canadian Agricultural Economics Society.
    2. Debra Wright & Matt Sloan & Kirsten Barrett, 2012. "Is There a Trade-off Between Quality and Cost? Telephone Versus Face-to-Face Interviewing of Persons with Disabilities," Mathematica Policy Research Reports cb6067df035641e99a913d534, Mathematica Policy Research.
    3. Mamine, Fateh & Fares, M'hand & Minviel, Jean Joseph, 2020. "Contract Design for Adoption of Agrienvironmental Practices: A Meta-analysis of Discrete Choice Experiments," Ecological Economics, Elsevier, vol. 176(C).
    4. Jorre T. A. Vannieuwenhuyze & Geert Loosveldt, 2013. "Evaluating Relative Mode Effects in Mixed-Mode Surveys," Sociological Methods & Research, vol. 42(1), pages 82-104, February.
    5. Guimarães, Maria Helena & Nunes, Luís Catela & Madureira, Lívia & Santos, José Lima & Boski, Tomasz & Dentinho, Tomaz, 2015. "Measuring birdwatchers preferences: A case for using online networks and mixed-mode surveys," Tourism Management, Elsevier, vol. 46(C), pages 102-113.
    6. Mackeben, Jan, 2020. "Mode Effects in the Fourth Wave of the Linked Personnel Panel (LPP) Employee Survey," FDZ Methodenreport 202005_en, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany].
    7. Grewenig, Elisabeth & Lergetporer, Philipp & Simon, Lisa & Werner, Katharina & Woessmann, Ludger, 2018. "Can Online Surveys Represent the Entire Population?," IZA Discussion Papers 11799, Institute of Labor Economics (IZA).
    8. repec:iab:iabfme:202005_en is not listed on IDEAS
    9. Cernat, Alexandru, 2014. "Impact of mixed modes on measurement errors and estimates of change in panel data," Understanding Society Working Paper Series 2014-05, Understanding Society at the Institute for Social and Economic Research.
    10. repec:mpr:mprres:7332 is not listed on IDEAS
    11. Bart Buelens & Jan A. van den Brakel, 2015. "Measurement Error Calibration in Mixed-mode Sample Surveys," Sociological Methods & Research, vol. 44(3), pages 391-426, August.
    12. Pirmin Fessler & Maximilian Kasy & Peter Lindner, 2018. "Survey mode effects on measured income inequality," The Journal of Economic Inequality, Springer;Society for the Study of Economic Inequality, vol. 16(4), pages 487-505, December.
    13. Jäckle, Annette & Roberts, Caroline, 2012. "Causes of mode effects: separating out interviewer and stimulus effects in comparisons of face-to-face and telephone surveys," ISER Working Paper Series 2012-27, Institute for Social and Economic Research.
    14. Wiśniowski, Arkadiusz & Bijak, Jakub & Forster, Jonathan J. & Smith, Peter W.F., 2019. "Hierarchical model for forecasting the outcomes of binary referenda," Computational Statistics & Data Analysis, Elsevier, vol. 133(C), pages 90-103.
    15. Skeie, Magnus Aa. & Lindhjem, Henrik & Skjeflo, Sofie & Navrud, Ståle, 2019. "Smartphone and tablet effects in contingent valuation web surveys – No reason to worry?," Ecological Economics, Elsevier, vol. 165(C), pages 1-1.
    16. Rolf Becker, 2022. "The effects of a special sequential mixed-mode design, and reminders, on panellists’ participation in a probability-based panel study," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(1), pages 259-284, February.
    17. Wenz, Alexander, 2017. "Completing web surveys on mobile devices: does screen size affect data quality?," ISER Working Paper Series 2017-05, Institute for Social and Economic Research.
    18. de Leeuw, E.D. & Hox, J.J.C.M. & Scherpenzeel, A.C., 2011. "Mode effect or question wording? Measurement error in mixed mode surveys," Other publications TiSEM 4218c762-6d80-4dfc-97ee-8, Tilburg University, School of Economics and Management.
    19. Bernhard Schimpl-Neimanns, 2013. "Methodische Herausforderungen bei der Erfassung von Bildung und Ausbildung im Mikrozensus" [Methodological challenges in measuring education and vocational training in the Microcensus], RatSWD Working Papers 221, German Data Forum (RatSWD).
    20. Giorgio Piccitto & Aart C. Liefbroer & Tom Emery, 2022. "Does the Survey Mode Affect the Association Between Subjective Well-being and its Determinants? An Experimental Comparison Between Face-to-Face and Web Mode," Journal of Happiness Studies, Springer, vol. 23(7), pages 3441-3461, October.
    21. Vannieuwenhuyze, Jorre T.A. & Lynn, Peter, 2014. "Measurement effects between CAPI and Web questionnaires in the UK Household Longitudinal Study," Understanding Society Working Paper Series 2014-01, Understanding Society at the Institute for Social and Economic Research.
    22. Tristram R. Ingham & Bernadette Jones & Meredith Perry & Martin von Randow & Barry Milne & Paula T. King & Linda W. Nikora & Andrew Sporle & Te Ao Mārama Study Group, 2023. "Measuring Māori Health, Wellbeing, and Disability in Aotearoa Using a Web-Based Survey Methodology," IJERPH, MDPI, vol. 20(18), pages 1-30, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:somere:v:48:y:2019:i:1:p:116-155. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.