
Dynamic Question Ordering in Online Surveys

Author

Listed:
  • Early Kirstin

    (Machine Learning Department, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA, 15213, United States of America.)

  • Mankoff Jennifer

    (Human-Computer Interaction Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA, 15213, United States of America.)

  • Fienberg Stephen E.

    (Department of Statistics, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA, 15213, United States of America.)

Abstract

Online surveys have the potential to support adaptive questions, where later questions depend on earlier responses. Past work has taken a rule-based approach, applied uniformly across all respondents. We envision a richer interpretation of adaptive questions, which we call Dynamic Question Ordering (DQO), where question order is personalized. Such an approach could increase engagement, and therefore response rate, as well as imputation quality. We present a DQO framework to improve survey completion and imputation. In the general survey-taking setting, we want to maximize survey completion, so we focus on ordering questions to engage the respondent and collect, ideally, all information, or at least the information that best characterizes the respondent, for accurate imputation. In another scenario, our goal is to provide a personalized prediction. Since reasonable predictions are possible from only a subset of questions, we are not concerned with motivating users to answer every question. Instead, we want to order questions to obtain the information that most reduces prediction uncertainty, while not being too burdensome. We illustrate this framework with two case studies, one for the prediction setting and one for the survey-taking setting. We also discuss DQO for national surveys and consider connections between our statistics-based question-ordering approach and cognitive survey methodology.
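
The abstract describes ordering questions by trading off the expected reduction in prediction uncertainty against respondent burden. As a minimal illustrative sketch of that idea only (not the authors' actual method), the snippet below greedily selects the next question to ask; the `model.predict_dist` interface, the imputation-by-sampling step, and the `burden` weights are all hypothetical assumptions introduced for illustration.

```python
import numpy as np

def expected_uncertainty_after(model, x_known, candidate, pool, n_samples=50):
    """Estimate prediction uncertainty if `candidate` were answered, by sampling
    plausible answers for it from previously observed respondents (`pool`).
    `model.predict_dist` is a hypothetical interface returning prediction samples."""
    variances = []
    for _ in range(n_samples):
        x = dict(x_known)
        # Impute the candidate answer by sampling from the empirical pool (an assumption).
        x[candidate] = np.random.choice(pool[candidate])
        variances.append(np.var(model.predict_dist(x)))
    return float(np.mean(variances))

def next_question(model, x_known, unanswered, pool, burden):
    """Pick the unanswered question with the best trade-off between expected
    uncertainty reduction and a per-question burden cost (illustrative weights)."""
    current = np.var(model.predict_dist(x_known))
    best_q, best_score = None, -np.inf
    for q in unanswered:
        gain = current - expected_uncertainty_after(model, x_known, q, pool)
        score = gain - burden[q]  # penalize questions that are costly to answer
        if score > best_score:
            best_q, best_score = q, score
    return best_q
```

In this sketch the loop would be repeated after each answer is recorded, so the ordering adapts to the individual respondent rather than following a fixed rule set.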

Suggested Citation

  • Early Kirstin & Mankoff Jennifer & Fienberg Stephen E., 2017. "Dynamic Question Ordering in Online Surveys," Journal of Official Statistics, Sciendo, vol. 33(3), pages 625-657, September.
  • Handle: RePEc:vrs:offsta:v:33:y:2017:i:3:p:625-657:n:4
    DOI: 10.1515/jos-2017-0030

    Download full text from publisher

    File URL: https://doi.org/10.1515/jos-2017-0030
    Download Restriction: no

    File URL: https://libkey.io/10.1515/jos-2017-0030?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source you can access through your library subscription
    ---><---

    References listed on IDEAS

    1. Lu Wang & Andrea Rotnitzky & Xihong Lin & Randall E. Millikan & Peter F. Thall, 2012. "Evaluation of Viable Dynamic Treatment Regimes in a Sequentially Randomized Trial of Advanced Prostate Cancer," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(498), pages 493-508, June.
    2. Robert M. Groves & Steven G. Heeringa, 2006. "Responsive design for household surveys: tools for actively controlling survey errors and costs," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 169(3), pages 439-457, July.
    3. Montgomery, Jacob M. & Cutler, Josh, 2013. "Computerized Adaptive Testing for Public Opinion Surveys," Political Analysis, Cambridge University Press, vol. 21(2), pages 172-192, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Matthew J Salganik & Karen E C Levy, 2015. "Wiki Surveys: Open and Quantifiable Social Data Collection," PLOS ONE, Public Library of Science, vol. 10(5), pages 1-17, May.
    2. Chun Asaph Young & Schouten Barry & Wagner James, 2017. "JOS Special Issue on Responsive and Adaptive Survey Design: Looking Back to See Forward – Editorial: In Memory of Professor Stephen E. Fienberg, 1942–2016," Journal of Official Statistics, Sciendo, vol. 33(3), pages 571-577, September.
    3. Reza C. Daniels, 2012. "A Framework for Investigating Micro Data Quality, with Application to South African Labour Market Household Surveys," SALDRU Working Papers 90, Southern Africa Labour and Development Research Unit, University of Cape Town.
    4. Reist, Benjamin M. & Rodhouse, Joseph B. & Ball, Shane T. & Young, Linda J., 2019. "Subsampling of Nonrespondents in the 2017 Census of Agriculture," NASS Research Reports 322826, United States Department of Agriculture, National Agricultural Statistics Service.
    5. Lewis Taylor, 2017. "Univariate Tests for Phase Capacity: Tools for Identifying When to Modify a Survey’s Data Collection Protocol," Journal of Official Statistics, Sciendo, vol. 33(3), pages 601-624, September.
    6. Jiayun Jin & Caroline Vandenplas & Geert Loosveldt, 2019. "The Evaluation of Statistical Process Control Methods to Monitor Interview Duration During Survey Data Collection," SAGE Open, , vol. 9(2), pages 21582440198, June.
    7. Andy Peytchev, 2013. "Consequences of Survey Nonresponse," The ANNALS of the American Academy of Political and Social Science, , vol. 645(1), pages 88-111, January.
    8. Roger Tourangeau & J. Michael Brick & Sharon Lohr & Jane Li, 2017. "Adaptive and responsive survey designs: a review and assessment," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(1), pages 203-223, January.
    9. repec:iab:iabfda:201307(en) is not listed on IDEAS
    10. Roberts Caroline & Vandenplas Caroline & Herzing Jessica M.E., 2020. "A Validation of R-Indicators as a Measure of the Risk of Bias using Data from a Nonresponse Follow-Up Survey," Journal of Official Statistics, Sciendo, vol. 36(3), pages 675-701, September.
    11. Böhme, Marcus & Stöhr, Tobias, 2012. "Guidelines for the use of household interview duration analysis in CAPI survey management," Kiel Working Papers 1779, Kiel Institute for the World Economy (IfW Kiel).
    12. Thomas A. Murray & Peter F. Thall & Ying Yuan & Sarah McAvoy & Daniel R. Gomez, 2017. "Robust Treatment Comparison Based on Utilities of Semi-Competing Risks in Non-Small-Cell Lung Cancer," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(517), pages 11-23, January.
    13. Mario Callegaro & Charlotte Steeh & Trent D. Buskirk & Vasja Vehovar & Vesa Kuusela & Linda Piekarski, 2007. "Fitting disposition codes to mobile phone surveys: experiences from studies in Finland, Slovenia and the USA," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 170(3), pages 647-670, July.
    14. Markus Frölich & Martin Huber, 2014. "Treatment Evaluation With Multiple Outcome Periods Under Endogeneity and Attrition," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(508), pages 1697-1711, December.
    15. Durrant Gabriele B. & Maslovskaya Olga & Smith Peter W. F., 2017. "Using Prior Wave Information and Paradata: Can They Help to Predict Response Outcomes and Call Sequence Length in a Longitudinal Study?," Journal of Official Statistics, Sciendo, vol. 33(3), pages 801-833, September.
    16. Raphael Nishimura & James Wagner & Michael Elliott, 2016. "Alternative Indicators for the Risk of Non-response Bias: A Simulation Study," International Statistical Review, International Statistical Institute, vol. 84(1), pages 43-62, April.
    17. Holly Matulewicz & Eric Grau & Arif Mamun & Gina Livermore, "undated". "Promoting Readiness of Minors in Supplemental Security Income (PROMISE): PROMISE 60-Month Sampling and Survey Plan," Mathematica Policy Research Reports be402161c12e402392af9182e, Mathematica Policy Research.
    18. G. Blom, Annelies, 2008. "Measuring nonresponse cross-nationally," ISER Working Paper Series 2008-41, Institute for Social and Economic Research.
    19. Sofie Marien & Marc Hooghe & Ellen Quintelier, 2010. "Inequalities in Non‐institutionalised Forms of Political Participation: A Multi‐level Analysis of 25 countries," Political Studies, Political Studies Association, vol. 58(1), pages 187-213, February.
    20. Lipps Oliver & Voorpostel Marieke, 2020. "Can Interviewer Evaluations Predict Short-Term and Long-Term Participation in Telephone Panels?," Journal of Official Statistics, Sciendo, vol. 36(1), pages 117-136, March.
    21. Willems, Jurgen, 2015. "Individual perceptions on the participant and societal functionality of non-formal education for youth: Explaining differences across countries based on the human development index," International Journal of Educational Development, Elsevier, vol. 44(C), pages 11-20.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:vrs:offsta:v:33:y:2017:i:3:p:625-657:n:4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Peter Golla (email available below). General contact details of provider: https://www.sciendo.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.