Printed from https://ideas.repec.org/p/hal/journl/hal-05391551.html

Field experiments: Overcoming the limitations of survey experiments for actionable behavioural insights

Author

Listed:
  • S. Dolnicar
  • G. Viglia
  • F. Kurtaliqi (Audencia Business School)

Abstract

Historically, one-off cross-sectional survey studies have dominated empirical research in tourism and hospitality. The inability to draw causal conclusions from such data has led to an increased uptake of survey experiments, which are easy and affordable to conduct and can identify causal relationships between constructs under controlled conditions. Survey experiments, however, have a severe limitation: they do not provide insights into real behaviour, restricting researchers' ability to generate actionable insights and reliable practical recommendations. This article offers a systematic comparison of three approaches (one-off cross-sectional survey studies, survey experiments, and field experiments) and provides step-by-step guidance on the design and implementation of field experiments and quasi-experimental field studies.
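The contrast the abstract draws can be made concrete with a minimal simulation: under random assignment to conditions, the difference in observed behaviour rates between arms estimates the causal effect of the treatment. The scenario (towel reuse under a persuasive message), the effect sizes, and all names below are hypothetical illustrations, not taken from the article.

```python
import random
import statistics

def run_field_experiment(n=1000, base_rate=0.30, lift=0.08, seed=42):
    """Simulate a two-arm field experiment.

    Each guest is randomly assigned to a control or treatment
    condition (e.g. a persuasive message about towel reuse), and a
    binary behaviour (reused towel: 1, did not: 0) is recorded.
    All parameter values are illustrative assumptions.
    """
    rng = random.Random(seed)
    control, treatment = [], []
    for _ in range(n):
        if rng.random() < 0.5:  # random assignment with equal probability
            control.append(1 if rng.random() < base_rate else 0)
        else:
            treatment.append(1 if rng.random() < base_rate + lift else 0)
    # With random assignment, the difference in mean behaviour rates
    # is an unbiased estimate of the average treatment effect.
    ate = statistics.mean(treatment) - statistics.mean(control)
    return ate, len(control), len(treatment)
```

The same difference-in-means logic applies to real field data; a quasi-experimental field study, by contrast, lacks the random assignment step and so requires additional identification assumptions.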

Suggested Citation

  • S. Dolnicar & G. Viglia & F. Kurtaliqi, 2026. "Field experiments: Overcoming the limitations of survey experiments for actionable behavioural insights," Post-Print hal-05391551, HAL.
  • Handle: RePEc:hal:journl:hal-05391551
    DOI: 10.1016/j.annals.2025.104080
    Note: View the original document on HAL open archive server: https://hal.science/hal-05391551v1

    Download full text from publisher

    File URL: https://hal.science/hal-05391551v1/document
    Download Restriction: no

    File URL: https://libkey.io/10.1016/j.annals.2025.104080?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    References listed on IDEAS

    1. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    2. Shadish, William R. & Clark, M. H. & Steiner, Peter M., 2008. "Can Nonrandomized Experiments Yield Accurate Answers? A Randomized Experiment Comparing Random and Nonrandom Assignments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1334-1344.
    3. Alem, Yonas & Eggert, Håkan & Kocher, Martin G. & Ruhinduka, Remidius D., 2018. "Why (field) experiments on unethical behavior are important: Comparing stated and revealed behavior," Journal of Economic Behavior & Organization, Elsevier, vol. 156(C), pages 71-85.
    4. repec:feb:artefa:0087 is not listed on IDEAS
    5. Rosenbaum, Paul R., 2007. "Interference Between Units in Randomized Experiments," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 191-200, March.
    6. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    7. Li, Fangxuan (Sam) & Su, Qianqian, 2024. "Influence of awe on tourism activity preferences," Annals of Tourism Research, Elsevier, vol. 107(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    2. Jose M. Fernandez, 2013. "An Empirical Model Of Learning Under Ambiguity: The Case Of Clinical Trials," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 54(2), pages 549-573, May.
    3. Vivian C. Wong & Peter M. Steiner, 2018. "Designs of Empirical Evaluations of Nonexperimental Methods in Field Settings," Evaluation Review, , vol. 42(2), pages 176-213, April.
    4. Arthur Lewbel, 2019. "The Identification Zoo: Meanings of Identification in Econometrics," Journal of Economic Literature, American Economic Association, vol. 57(4), pages 835-903, December.
    5. Brendon McConnell & Marcos Vera-Hernandez, 2015. "Going beyond simple sample size calculations: a practitioner's guide," IFS Working Papers W15/17, Institute for Fiscal Studies.
    6. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    7. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    8. Goeschl, Timo & Kettner, Sara Elisa & Lohse, Johannes & Schwieren, Christiane, 2015. "What do we learn from public good games about voluntary climate action? Evidence from an artefactual field experiment," Working Papers 0595, University of Heidelberg, Department of Economics.
    9. Goeschl, Timo & Kettner, Sara Elisa & Lohse, Johannes & Schwieren, Christiane, 2020. "How much can we learn about voluntary climate action from behavior in public goods games?," Ecological Economics, Elsevier, vol. 171(C).
    10. Onur Altindag & Theodore J. Joyce & Julie A. Reeder, 2019. "Can Nonexperimental Methods Provide Unbiased Estimates of a Breastfeeding Intervention? A Within-Study Comparison of Peer Counseling in Oregon," Evaluation Review, , vol. 43(3-4), pages 152-188, June.
    11. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    12. Angus S. Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," NBER Working Papers 14690, National Bureau of Economic Research, Inc.
    13. Maria Cancian & Daniel R. Meyer & Robert G. Wood, 2022. "Do Carrots Work Better than Sticks? Results from the National Child Support Noncustodial Parent Employment Demonstration," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 41(2), pages 552-578, March.
    14. Baird, Matthew D. & Engberg, John & Gutierrez, Italo A., 2022. "RCT evidence on differential impact of US job training programmes by pre-training employment status," Labour Economics, Elsevier, vol. 75(C).
    15. Nicolaj N. Mühlbach, 2020. "Tree-based Synthetic Control Methods: Consequences of moving the US Embassy," CREATES Research Papers 2020-04, Department of Economics and Business Economics, Aarhus University.
    16. Sylvain Chassang & Erik Snowberg & Ben Seymour & Cayley Bowles, 2015. "Accounting for Behavior in Treatment Effects: New Applications for Blind Trials," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-13, June.
    17. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    18. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    19. Riddell, Chris & Riddell, W. Craig, 2016. "When Can Experimental Evidence Mislead? A Re-Assessment of Canada's Self Sufficiency Project," IZA Discussion Papers 9939, IZA Network @ LISER.
    20. Büttner, Thomas, 2008. "Ankündigungseffekt oder Maßnahmewirkung? Eine Evaluation von Trainingsmaßnahmen zur Überprüfung der Verfügbarkeit (Notification or participation : which treatment actually activates job-seekers? An evaluation of short-term training programmes)," Zeitschrift für ArbeitsmarktForschung - Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 41(1), pages 25-40.

    More about this item




    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hal:journl:hal-05391551. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: CCSD (email available below). General contact details of provider: https://hal.archives-ouvertes.fr/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.