Printed from https://ideas.repec.org/a/kap/enreec/v73y2019i3d10.1007_s10640-018-0289-x.html

Subject Pools and Deception in Agricultural and Resource Economics Experiments

Authors

Listed:
  • Timothy N. Cason (Purdue University)
  • Steven Y. Wu (Purdue University)

Abstract

The use of student subjects and deception in experiments are two controversial issues that often raise concerns among editors and reviewers, which might prevent quality research from being published in agricultural and resource economics (ARE) journals. We provide a self-contained methodological discussion of these issues. We argue that field professionals are the most appropriate subjects for questions related to policy or measurement, and students are the most appropriate subjects for scientific research questions closely tied to economic theory. Active deception, where subjects are provided with explicitly misleading information, has been avoided in the mainstream economics discipline because it can lead to a loss of experimental control, lead to subject selection bias, and impose negative externalities on other researchers. Disciplinary ARE journals may want to abide by these norms against deception to maintain credibility. Interdisciplinary ARE journals may have more flexibility, although it is important to provide guidelines to avoid too much reviewer-specific variation in standards. For ARE researchers, we suggest employing a deception-free experimental design whenever possible because we know of no field in which deception is encouraged.

Suggested Citation

  • Timothy N. Cason & Steven Y. Wu, 2019. "Subject Pools and Deception in Agricultural and Resource Economics Experiments," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 743-758, July.
  • Handle: RePEc:kap:enreec:v:73:y:2019:i:3:d:10.1007_s10640-018-0289-x
    DOI: 10.1007/s10640-018-0289-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10640-018-0289-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    As access to this document is restricted, you may want to look for a different version below or search for one elsewhere.


    References listed on IDEAS

    1. Higgins, Nathaniel & Hellerstein, Daniel & Wallander, Steven & Lynch, Lori, 2017. "Economic Experiments for Policy Analysis and Program Design: A Guide for Agricultural Decisionmakers," Economic Research Report 262456, United States Department of Agriculture, Economic Research Service.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Andreas Ortmann & Ralph Hertwig, 2002. "The Costs of Deception: Evidence from Psychology," Experimental Economics, Springer;Economic Science Association, vol. 5(2), pages 111-131, October.
    4. Jamison, Julian & Karlan, Dean & Schechter, Laura, 2008. "To deceive or not to deceive: The effect of deception on behavior in future laboratory experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 68(3-4), pages 477-488, December.
    5. Gary Charness & Marie-Claire Villeval, 2009. "Cooperation and Competition in Intergenerational Experiments in the Field and the Laboratory," American Economic Review, American Economic Association, vol. 99(3), pages 956-978, June.
    6. Armin Falk & James J. Heckman, 2009. "Lab Experiments are a Major Source of Knowledge in the Social Sciences," Working Papers 200935, Geary Institute, University College Dublin.
    7. Gregory Colson & Jay R. Corrigan & Carola Grebitus & Maria L. Loureiro & Matthew C. Rousu, 2016. "Which Deceptive Practices, If Any, Should Be Allowed in Experimental Economics Research? Results from Surveys of Applied Experimental Economists and Students," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 98(2), pages 610-621.
    8. Andersen, Steffen & Harrison, Glenn W. & Lau, Morten Igel & Rutström, E. Elisabet, 2010. "Preference heterogeneity in experiments: Comparing the field and laboratory," Journal of Economic Behavior & Organization, Elsevier, vol. 73(2), pages 209-224, February.
    9. David H. Herberich & John A. List, 2012. "Digging into Background Risk: Experiments with Farmers and Students," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 94(2), pages 457-463.
    10. Brian E. Roe, 2015. "The Risk Attitudes of U.S. Farmers," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 37(4), pages 553-574.
    11. Julianna M. Butler & Christian A. Vossler, 2018. "What is an Unregulated and Potentially Misleading Label Worth? The case of “Natural”-Labelled Groceries," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 70(2), pages 545-564, June.
    12. Vivi Alatas & Lisa Cameron & Ananish Chaudhuri & Nisvan Erkal & Lata Gangadharan, 2009. "Subject pool effects in a corruption experiment: A comparison of Indonesian public servants and Indonesian students," Experimental Economics, Springer;Economic Science Association, vol. 12(1), pages 113-132, March.
    13. Cooper, David J., 2014. "A Note on Deception in Economic Experiments," Journal of Wine Economics, Cambridge University Press, vol. 9(2), pages 111-114, August.
    14. Toke R. Fosgaard, 2018. "Cooperation stability: A representative sample in the lab," IFRO Working Paper 2018/08, University of Copenhagen, Department of Food and Resource Economics.
    15. Jeffrey Carpenter & Erika Seki, 2011. "Do Social Preferences Increase Productivity? Field Experimental Evidence From Fishermen In Toyama Bay," Economic Inquiry, Western Economic Association International, vol. 49(2), pages 612-630, April.
    16. David Cooper & John Kagel & Qing Liang Gu & Wei Lo, 1999. "Gaming against managers in incentive systems: Experimental results with Chinese students and Chinese managers," Artefactual Field Experiments 00038, The Field Experiments Website.
    17. Ignacio Palacios-Huerta & Oscar Volij, 2008. "Experientia Docet: Professionals Play Minimax in Laboratory Experiments," Econometrica, Econometric Society, vol. 76(1), pages 71-115, January.
    18. Joseph Henrich & Steven J. Heine & Ara Norenzayan, 2010. "The Weirdest People in the World?," Working Paper Series of the German Council for Social and Economic Data 139, German Council for Social and Economic Data (RatSWD).
    19. Laure Kuhfuss & Raphaële Préget & Sophie Thoyer & Nick Hanley, 2016. "Nudging farmers to enrol land into agri-environmental schemes: the role of a collective bonus," European Review of Agricultural Economics, Foundation for the European Review of Agricultural Economics, vol. 43(4), pages 609-636.
    20. John Wooders, 2010. "Does Experience Teach? Professionals and Minimax Play in the Lab," Econometrica, Econometric Society, vol. 78(3), pages 1143-1154, May.
    21. Vernon L. Smith, 1962. "An Experimental Study of Competitive Market Behavior," Journal of Political Economy, University of Chicago Press, vol. 70(2), pages 111-137, April.
    22. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    23. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    24. Arnab Mitra & Michael R. Moore, 2018. "Green Electricity Markets as Mechanisms of Public-Goods Provision: Theory and Experimental Evidence," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 71(1), pages 45-71, September.
    25. Smith, Vernon L., 1982. "Microeconomic Systems as an Experimental Science," American Economic Review, American Economic Association, vol. 72(5), pages 923-955, December.
    26. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, Oxford University Press, vol. 130(3), pages 1117-1165.
    27. Kröll, Markus & Rustagi, Devesh, 2017. "Reputation, honesty, and cheating in informal milk markets in India," SAFE Working Paper Series 134, Leibniz Institute for Financial Research SAFE.
    28. Matthew C. Rousu & Gregory Colson & Jay R. Corrigan & Carola Grebitus & Maria L. Loureiro, 2015. "Deception in Experiments: Towards Guidelines on Use in Applied Economics Research," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 37(3), pages 524-536.
    29. Daniel Zizzo, 2010. "Experimenter demand effects in economic experiments," Experimental Economics, Springer;Economic Science Association, vol. 13(1), pages 75-98, March.
    30. Karolina Safarzynska, 2018. "The Impact of Resource Uncertainty and Intergroup Conflict on Harvesting in the Common-Pool Resource Experiment," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 71(4), pages 1001-1025, December.
    31. David J. Cooper, 1999. "Gaming against Managers in Incentive Systems: Experimental Results with Chinese Students and Chinese Managers," American Economic Review, American Economic Association, vol. 89(4), pages 781-804, September.
    32. Jordan F. Suter & Christian A. Vossler, 2014. "Towards an Understanding of the Performance of Ambient Tax Mechanisms in the Field: Evidence from Upstate New York Dairy Farmers," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 96(1), pages 92-107.
    33. David R. Just & Steven Y. Wu, 2009. "Experimental Economics and the Economics of Contracts," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 91(5), pages 1382-1388.
    34. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    35. Timothy N. Cason & Charles R. Plott, 2014. "Misconceptions and Game Form Recognition: Challenges to Theories of Revealed Preference and Framing," Journal of Political Economy, University of Chicago Press, vol. 122(6), pages 1235-1270.

    Citations

    Citations are extracted by the CitEc project.


    Cited by:

    1. Leah H. Palm-Forster & Paul J. Ferraro & Nicholas Janusch & Christian A. Vossler & Kent D. Messer, 2019. "Behavioral and Experimental Agri-Environmental Research: Methodological Challenges, Literature Gaps, and Recommendations," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 719-742, July.

    More about this item

    Keywords

Laboratory experiments; Field experiments; Methodology

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • Q10 - Agricultural and Natural Resource Economics; Environmental and Ecological Economics - - Agriculture - - - General
    • Q30 - Agricultural and Natural Resource Economics; Environmental and Ecological Economics - - Nonrenewable Resources and Conservation - - - General
    • Q50 - Agricultural and Natural Resource Economics; Environmental and Ecological Economics - - Environmental Economics - - - General


    Corrections

All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions; when requesting a correction, please mention this item's handle: RePEc:kap:enreec:v:73:y:2019:i:3:d:10.1007_s10640-018-0289-x. See the general information about how to correct material in RePEc.


IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.