Printed from https://ideas.repec.org/p/osf/socarx/bc7qn_v1.html

Mode effects on survey item measurement: A systematic review of the experimental evidence

Author

Listed:
  • Tomova, Georgia D.
  • Silverwood, Richard J.
  • Wright, Liam

Abstract

Survey data are increasingly collected using mixed-mode designs. However, the measurement of survey items may differ across modes, introducing ‘mode effects’, a type of systematic measurement error which can bias analyses of mixed-mode data. While the theoretical mechanisms giving rise to mode effects have been discussed in detail, the empirical evidence on their occurrence and size is fragmented. In addition, while many existing statistical approaches for handling mode effects require unrealistic assumptions, other more suitable approaches remain underutilised due to the need for external evidence on the magnitude of mode effects. To address this, we conducted a systematic review of the experimental literature on mode effects. We searched multiple bibliographic databases and grey literature sources, and implemented backwards and forwards citation screening. Studies eligible for inclusion were (quasi-)experimental, sampled from the general population (or age-, sex-, or region-specific strata), and reported mode effect estimates on item measurement. We extracted comprehensive information relating to study design, sampling, mode effect estimates, and reporting. Ninety experimental studies published between 1967 and 2024 met the inclusion criteria, together contributing 4,113 mode effect estimates for 3,545 unique variables. Mode effects were generally small, typically below 0.2 SD. However, larger mode effects were more commonly observed when modes differed by interviewer involvement or by question delivery (visual vs aural), as well as for sensitive items (e.g., sexual behaviour, social life), which aligns with pre-existing theory on the causes of mode effects. Generally, where mode effects occur, they are item-, mode-, and population-specific. Reporting quality varied substantially, and insufficient detail regarding randomisation compliance, non-response, and uncertainty of estimates was common. We collated all mode effect estimates into a free online database and provide a set of recommendations to improve the reporting of future studies.
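The abstract reports mode effects on the standard-deviation (SD) scale. A minimal sketch of how such a standardized estimate can be computed for a single item measured in two modes is below; the function name and the toy web vs face-to-face data are illustrative assumptions, not taken from the review, and individual studies may use other estimators.

```python
import math

def standardized_mode_effect(values_a, values_b):
    """Standardized mean difference (Cohen's d) between responses to the
    same item collected in two survey modes, using the pooled SD.
    Illustrative sketch only; not the review's exact estimator."""
    n_a, n_b = len(values_a), len(values_b)
    mean_a = sum(values_a) / n_a
    mean_b = sum(values_b) / n_b
    var_a = sum((x - mean_a) ** 2 for x in values_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in values_b) / (n_b - 1)
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b)
                          / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical 5-point-scale responses to one item in two modes
web = [3, 4, 4, 5, 3, 4, 5, 4]
f2f = [3, 3, 4, 4, 3, 4, 4, 3]
d = standardized_mode_effect(web, f2f)
```

On this scale, an estimate of |d| below 0.2 corresponds to the "generally small" mode effects the review describes.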

Suggested Citation

  • Tomova, Georgia D & Silverwood, Richard J. & Wright, Liam, 2026. "Mode effects on survey item measurement: A systematic review of the experimental evidence," SocArXiv bc7qn_v1, Center for Open Science.
  • Handle: RePEc:osf:socarx:bc7qn_v1
    DOI: 10.31219/osf.io/bc7qn_v1

    Download full text from publisher

    File URL: https://osf.io/download/695d48d55d51b4379906c45a/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/bc7qn_v1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zawojska, Ewa & Czajkowski, Mikołaj, "undated". "Are preferences stated in web vs. personal interviews different? A comparison of willingness to pay results for a large multi-country study of the Baltic Sea eutrophication reduction," Annual Meeting, 2017, June 18-21, Montreal, Canada 258604, Canadian Agricultural Economics Society.
    2. Böhme, Marcus & Stöhr, Tobias, 2012. "Guidelines for the use of household interview duration analysis in CAPI survey management," Kiel Working Papers 1779, Kiel Institute for the World Economy.
    3. Debra Wright & Matt Sloan & Kirsten Barrett, 2012. "Is There a Trade-off Between Quality and Cost? Telephone Versus Face-to-Face Interviewing of Persons with Disabilities," Mathematica Policy Research Reports cb6067df035641e99a913d534, Mathematica Policy Research.
    4. Jorre T. A. Vannieuwenhuyze & Geert Loosveldt, 2013. "Evaluating Relative Mode Effects in Mixed-Mode Surveys," Sociological Methods & Research, vol. 42(1), pages 82-104, February.
    5. Mackeben, Jan, 2020. "Mode Effects in the Fourth Wave of the Linked Personnel Panel (LPP) Employee Survey," FDZ Methodenreport 202005_en, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany].
    6. Giorgio Piccitto & Hans M. A. Schadee & Gabriele Ballarino, 2023. "Job Satisfaction and Gender in Italy: A Structural Equation Modeling Approach," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 169(3), pages 775-793, October.
    7. Joachim De Weerdt & Kalle Hirvonen, 2016. "Risk Sharing and Internal Migration," Economic Development and Cultural Change, University of Chicago Press, vol. 65(1), pages 63-86.
    8. Leonardo Bursztyn & Ingar Haaland & Nicolas Röver & Christopher Roth, 2025. "The Social Desirability Atlas," ECONtribute Discussion Papers Series 365, University of Bonn and University of Cologne, Germany.
    9. Feng, Shihan & Huang, Feng, 2024. "Does survey mode matter? An experimental evaluation of data quality in China," China Economic Review, Elsevier, vol. 88(C).
    10. Grewenig, Elisabeth & Lergetporer, Philipp & Simon, Lisa & Werner, Katharina & Woessmann, Ludger, 2018. "Can Online Surveys Represent the Entire Population?," IZA Discussion Papers 11799, IZA Network @ LISER.
    11. Joachim De Weerdt & Kathleen Beegle & Jed Friedman & John Gibson, 2016. "The Challenge of Measuring Hunger through Survey Design," Economic Development and Cultural Change, University of Chicago Press, vol. 64(4), pages 727-758.
    12. Xavier Cirera & Diego A. Comin & Marcio Cruz & Kyung Min Lee, 2020. "Anatomy of Technology in the Firm," NBER Working Papers 28080, National Bureau of Economic Research, Inc.
    13. Bart Buelens & Jan A. van den Brakel, 2015. "Measurement Error Calibration in Mixed-mode Sample Surveys," Sociological Methods & Research, , vol. 44(3), pages 391-426, August.
    14. Rao, Lakshman Nagraj & Gentile, Elisabetta & Pipon, Dave & Roque, Jude David & Thuy, Vu Thi Thu, 2020. "The impact of computer-assisted personal interviewing on survey duration, quality, and cost: Evidence from the Viet Nam Labor Force Survey," GLO Discussion Paper Series 605, Global Labor Organization (GLO).
    15. Abate, Gashaw T. & de Brauw, Alan & Hirvonen, Kalle & Wolle, Abdulazize, 2023. "Measuring consumption over the phone: Evidence from a survey experiment in urban Ethiopia," Journal of Development Economics, Elsevier, vol. 161(C).
    16. Christensen, Cheryl. "Progress and Challenges in Global Food Security," Amber Waves: The Economics of Food, Farming, Natural Resources, and Rural America, United States Department of Agriculture, Economic Research Service, vol. 0(01).
    17. Dolnicar, Sara & Grün, Bettina & Leisch, Friedrich, 2016. "Increasing sample size compensates for data problems in segmentation studies," Journal of Business Research, Elsevier, vol. 69(2), pages 992-999.
    18. Yingbin Zhang & Zhaoxi Yang & Yehui Wang, 2022. "The Impact of Extreme Response Style on the Mean Comparison of Two Independent Samples," SAGE Open, , vol. 12(2), pages 21582440221, June.
    19. Di Francesco, Riccardo & Mellace, Giovanni, 2025. "Causal inference for qualitative outcomes," Economics Letters, Elsevier, vol. 256(C).
    20. Emmanuel Nshakira-Rukundo & Essa Chanie Mussa & Nathan Nshakira & Nicolas Gerber & Joachim von Braun, 2021. "Impact of community-based health insurance on utilisation of preventive health services in rural Uganda: a propensity score matching approach," International Journal of Health Economics and Management, Springer, vol. 21(2), pages 203-227, June.

