
More Is Not Always Better: An Experimental Individual-Level Validation of the Randomized Response Technique and the Crosswise Model

Author

Listed:
  • Marc Höglinger
  • Ben Jann

Abstract

Social desirability and the fear of sanctions can deter survey respondents from responding truthfully to sensitive questions. Self-reports on norm-breaking behavior such as shoplifting, non-voting, or tax evasion may therefore be subject to considerable misreporting. To mitigate such misreporting, various indirect techniques for asking sensitive questions, such as the randomized response technique (RRT), have been proposed in the literature. In our study, we evaluate the viability of several variants of the RRT, including the recently proposed crosswise-model RRT, by comparing respondents’ self-reports on cheating in dice games to actual cheating behavior, thereby distinguishing between false negatives (underreporting) and false positives (overreporting). The study was implemented as an online survey on Amazon Mechanical Turk (N = 6,505). Our results indicate that the forced-response RRT and the unrelated-question RRT, as implemented in our survey, fail to reduce the level of misreporting compared to conventional direct questioning. For the crosswise-model RRT, we do observe a reduction of false negatives (that is, an increase in the proportion of cheaters who admit having cheated). At the same time, however, there is an increase in false positives (that is, an increase in non-cheaters who falsely admit having cheated). Overall, our findings suggest that none of the implemented sensitive question techniques substantially outperforms direct questioning. Furthermore, our study demonstrates the importance of distinguishing false negatives and false positives when evaluating the validity of sensitive question techniques.
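
The prevalence estimators behind two of the designs compared in the abstract can be illustrated with a short sketch. This is not code from the paper: the parameter values (an innocuous randomizing question with known prevalence 1/6, a forced-response design with a 2/3 truth probability) are illustrative assumptions, and the formulas assume fully compliant respondents.

```python
def crosswise_estimate(same_share, p):
    """Crosswise-model prevalence estimator (illustrative).

    Respondents report only whether their answers to the sensitive
    question and to an innocuous question with known prevalence p
    agree. The expected agreement share is pi*p + (1 - pi)*(1 - p),
    so pi = (same_share + p - 1) / (2*p - 1), for p != 1/2.
    """
    return (same_share + p - 1) / (2 * p - 1)


def forced_response_estimate(yes_share, p_truth, p_forced_yes):
    """Forced-response RRT prevalence estimator (illustrative).

    With probability p_truth the respondent answers truthfully, and
    with probability p_forced_yes is instructed to say "yes", so the
    expected yes share is pi*p_truth + p_forced_yes.
    """
    return (yes_share - p_forced_yes) / p_truth


# Worked check: true prevalence pi = 0.30.
p = 1 / 6
same_share = 0.30 * p + 0.70 * (1 - p)        # theoretical agreement share
print(round(crosswise_estimate(same_share, p), 2))  # -> 0.3

yes_share = 0.30 * (2 / 3) + 1 / 6            # theoretical "yes" share
print(round(forced_response_estimate(yes_share, 2 / 3, 1 / 6), 2))  # -> 0.3
```

Both estimators recover the true prevalence only in expectation and under full compliance; the paper's validation design tests exactly how far observed answers deviate from these model assumptions.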

Suggested Citation

  • Marc Höglinger & Ben Jann, 2016. "More Is Not Always Better: An Experimental Individual-Level Validation of the Randomized Response Technique and the Crosswise Model," University of Bern Social Sciences Working Papers 18, University of Bern, Department of Social Sciences.
  • Handle: RePEc:bss:wpaper:18

    Download full text from publisher

    File URL: http://repec.sowi.unibe.ch/files/wp18/Hoeglinger-Jann-2016-MTurk.pdf
    File Function: working paper
    Download Restriction: no

    File URL: http://repec.sowi.unibe.ch/files/wp18/Hoeglinger-Jann-2016-MTurk-Analysis.pdf
    File Function: documentation of analysis (log files)
    Download Restriction: no

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Jun-Wu Yu & Guo-Liang Tian & Man-Lai Tang, 2008. "Two new models for survey sampling with sensitive characteristic: design and analysis," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 67(3), pages 251-263, April.
    3. Kundt, Thorben, 2014. "Applying “Benford’s law” to the Crosswise Model: Findings from an online survey on tax evasion," Working Paper 148/2014, Helmut Schmidt University, Hamburg.
    4. Urs Fischbacher & Franziska Föllmi-Heusi, 2013. "Lies In Disguise—An Experimental Study On Cheating," Journal of the European Economic Association, European Economic Association, vol. 11(3), pages 525-547, June.
    5. Elisabeth Coutts & Ben Jann, 2011. "Sensitive Questions in Online Surveys: Experimental Results for the Randomized Response Technique (RRT) and the Unmatched Count Technique (UCT)," Sociological Methods & Research, vol. 40(1), pages 169-193, February.
    6. Ulf Böckenholt & Sema Barlas & Peter G. M. van der Heijden, 2009. "Do randomized‐response designs eliminate response biases? An empirical study of non‐compliance behavior," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 24(3), pages 377-392, April.
    7. Marc Höglinger & Ben Jann & Andreas Diekmann, 2014. "Sensitive Questions in Online Surveys: An Experimental Evaluation of the Randomized Response Technique and the Crosswise Model," University of Bern Social Sciences Working Papers 9, University of Bern, Department of Social Sciences, revised 24 Jun 2014.
    8. Korndörfer, Martin & Krumpal, Ivar & Schmukle, Stefan C., 2014. "Measuring and explaining tax evasion: Improving self-reports using the crosswise model," Journal of Economic Psychology, Elsevier, vol. 45(C), pages 18-32.
    9. Andreas Diekmann, 2012. "Making Use of “Benford’s Law” for the Randomized Response Technique," Sociological Methods & Research, vol. 41(2), pages 325-334, May.
    10. Kundt, Thorben C. & Misch, Florian & Nerré, Birger, 2013. "Re-assessing the merits of measuring tax evasions through surveys: Evidence from Serbian firms," ZEW Discussion Papers 13-047, ZEW - Zentrum für Europäische Wirtschaftsforschung / Center for European Economic Research.
    11. Antje Kirchner, 2015. "Validating Sensitive Questions: A Comparison of Survey and Register Data," Journal of Official Statistics, Sciendo, vol. 31(1), pages 31-59, March.

    Citations


    Cited by:

    1. Marc Höglinger & Andreas Diekmann, 2016. "Uncovering a Blind Spot in Sensitive Question Research: False Positives Undermine the Crosswise-Model RRT," University of Bern Social Sciences Working Papers 24, University of Bern, Department of Social Sciences.

    More about this item

    Keywords

    Sensitive Questions; Online Survey; Amazon Mechanical Turk; Randomized Response Technique; Crosswise Model; Dice Game; Validation;

    JEL classification:

    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • C83 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Survey Methods; Sampling Methods

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bss:wpaper:18. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Ben Jann. General contact details of provider: http://www.sowi.unibe.ch/.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.