Determinants of social desirability bias in sensitive surveys: a literature review
Survey questions about taboo topics such as sexual activities, illegal behaviour such as benefit fraud, or antisocial attitudes such as racism often yield inaccurate survey estimates distorted by social desirability bias. Out of self-presentation concerns, respondents underreport socially undesirable activities and overreport socially desirable ones. This article reviews theoretical explanations of socially motivated misreporting in sensitive surveys and provides an overview of the empirical evidence on the effectiveness of specific survey methods designed to encourage respondents to answer more honestly. Besides psychological factors, such as a stable need for social approval and a preference for avoiding embarrassing social interactions, features of the survey design, the interviewer's characteristics, and the survey situation determine whether and how strongly social desirability bias occurs. The review shows that survey designers can obtain more valid data by selecting data collection strategies that reduce respondents' discomfort when answering sensitive questions. Copyright Springer Science+Business Media B.V. 2013
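One of the methods reviewed, the randomized response technique (RRT, referenced in the cited Coutts & Jann papers below), protects respondents by adding deliberate noise to their answers. As a minimal sketch, assuming Warner's classic design (each respondent answers the sensitive statement with probability p and its negation otherwise), the true prevalence can be recovered from the observed "yes" rate; the parameter values and function names here are illustrative, not taken from the article:

```python
import random

def warner_rrt_estimate(n, true_prevalence, p, seed=0):
    """Simulate Warner's randomized response design and recover prevalence.

    Each respondent privately flips a biased coin: with probability p they
    answer the sensitive statement ("I am a carrier") truthfully, otherwise
    they answer its negation. The interviewer only sees yes/no, never which
    statement was answered, so individual answers reveal nothing definitive.
    """
    rng = random.Random(seed)
    yes_count = 0
    for _ in range(n):
        carrier = rng.random() < true_prevalence
        if rng.random() < p:          # coin chose the sensitive statement
            yes_count += carrier
        else:                          # coin chose the negated statement
            yes_count += not carrier
    lam = yes_count / n                # observed "yes" rate
    # E[lam] = p*pi + (1-p)*(1-pi), so the moment estimator is
    # pi_hat = (lam - (1 - p)) / (2p - 1), defined for p != 0.5.
    return (lam - (1 - p)) / (2 * p - 1)

estimate = warner_rrt_estimate(100_000, true_prevalence=0.20, p=0.7)
```

The price of the privacy protection is statistical efficiency: the closer p is to 0.5, the stronger the anonymity guarantee but the larger the variance of the estimator, which is the trust/precision trade-off discussed in the RRT literature cited below.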
Volume (Year): 47 (2013)
Issue (Month): 4 (June)
References:
- Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
- Elisabeth Coutts & Ben Jann, 2011. "Sensitive Questions in Online Surveys: Experimental Results for the Randomized Response Technique (RRT) and the Unmatched Count Technique (UCT)," Sociological Methods & Research, SAGE Publishing, vol. 40(1), pages 169-193, February.
- Elisabeth Coutts & Ben Jann & Ivar Krumpal & Anatol-Fiete Näher, 2011. "Plagiarism in Student Papers: Prevalence Estimates Using Special Techniques for Sensitive Questions," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 231(5-6), pages 749-760, October.
- Johannes Landsheer & Peter Van Der Heijden & Ger Van Gils, 1999. "Trust and Understanding, Two Psychological Aspects of Randomized Response," Quality & Quantity: International Journal of Methodology, Springer, vol. 33(1), pages 1-12, February.
- Edith de Leeuw, 2001. "Reducing Missing Data in Surveys: An Overview of Methods," Quality & Quantity: International Journal of Methodology, Springer, vol. 35(2), pages 147-160, May.
- Rolf Becker, 2006. "Selective Response to Questions on Delinquency," Quality & Quantity: International Journal of Methodology, Springer, vol. 40(4), pages 483-498, August.
Handle: RePEc:spr:qualqt:v:47:y:2013:i:4:p:2025-2047