
More Evidence on the Use of Constructed-Response Questions in Principles of Economics Classes

Author

  • Stephen Hickson
  • W. Robert Reed

Abstract

This study provides evidence that constructed-response (CR) questions contribute information about student knowledge and understanding that is not contained in multiple-choice (MC) questions. We use an extensive data set of individual assessment results from Introductory Macro- and Microeconomics classes at a large, public university. We find that (i) CR scores contain information not captured by MC questions, (ii) this information is correlated with a measure of student knowledge and understanding of course material, and (iii) CR questions are better able to “explain” academic achievement in other courses than additional MC questions are. There is some evidence to suggest that this greater explanatory power stems from the ability of CR questions to measure higher-level learning as characterized by Bloom’s taxonomy (Bloom, 1956). Both (i) the generalisability of our results to other principles of economics classes and (ii) the practical significance (in terms of students’ grades) of our findings remain to be determined.

Suggested Citation

  • Stephen Hickson & W. Robert Reed, 2011. "More Evidence on the Use of Constructed-Response Questions in Principles of Economics Classes," Working Papers in Economics 11/02, University of Canterbury, Department of Economics and Finance.
  • Handle: RePEc:cbt:econwp:11/02

    Download full text from publisher

    File URL: https://repec.canterbury.ac.nz/cbt/econwp/1102.pdf
    Download Restriction: no


    References listed on IDEAS

    1. William E. Becker & Carol Johnston, 1999. "The Relationship between Multiple Choice and Essay Response Questions in Assessing Economics Understanding," The Economic Record, The Economic Society of Australia, vol. 75(4), pages 348-357, December.
    2. Randall Krieg & Bulent Uyar, 2001. "Student performance in business and economics statistics: Does exam structure matter?," Journal of Economics and Finance, Springer;Academy of Economics and Finance, vol. 25(2), pages 229-241, June.
    3. Stephen Buckles & John J. Siegfried, 2006. "Using Multiple-Choice Questions to Evaluate In-Depth Learning of Economics," The Journal of Economic Education, Taylor & Francis Journals, vol. 37(1), pages 48-57, January.
    4. William B. Walstad, 2006. "Testing for Depth of Understanding in Economics Using Essay Questions," The Journal of Economic Education, Taylor & Francis Journals, vol. 37(1), pages 38-47, January.
    5. William B. Walstad & William E. Becker, 1994. "Achievement Differences on Multiple-Choice and Essay Tests in Economics," American Economic Review, American Economic Association, vol. 84(2), pages 193-196, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. María Paz Espinosa & Javier Gardeazabal, 2013. "Do Students Behave Rationally in Multiple Choice Tests? Evidence from a Field Experiment," Journal of Economics and Management, College of Business, Feng Chia University, Taiwan, vol. 9(2), pages 107-135, July.
    2. P. Everaert & N. Arthur, 2012. "Constructed-response versus multiple choice: the impact on performance in combination with gender," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 12/777, Ghent University, Faculty of Economics and Business Administration.
    3. Neal Arthur & Patricia Everaert, 2012. "Gender and Performance in Accounting Examinations: Exploring the Impact of Examination Format," Accounting Education, Taylor & Francis Journals, vol. 21(5), pages 471-487, October.
    4. Ken Rebeck & Carlos Asarta, 2011. "Methods of Assessment in the College Economics Course," Chapters, in: Gail M. Hoyt & KimMarie McGoldrick (ed.), International Handbook on Teaching and Learning Economics, chapter 16, Edward Elgar Publishing.
    5. Ambrose & Cheryl A. Kier, 2017. "On Students’ Perception of a Multi-Scheme Assessment Method," Journal for Economic Educators, Middle Tennessee State University, Business and Economic Research Center, vol. 17(1), pages 40-52, Spring.
    6. Melanie A. Fennell & Irene R. Foster, 2021. "Test Format and Calculator Use in the Testing of Basic Math Skills for Principles of Economics: Experimental Evidence," The American Economist, Sage Publications, vol. 66(1), pages 29-45, March.
    7. McKee, Douglas & Orlov, George, 2023. "The Economic Statistics Skills Assessment (ESSA)," International Review of Economics Education, Elsevier, vol. 44(C).
    8. Steven B. Caudill & Franklin G. Mixon, 2023. "Guess for Success? Application of a Mixture Model to Test-Wiseness on Multiple-Choice Exams," Stats, MDPI, vol. 6(3), pages 1-6, June.
    9. Tang, Tommy, 2023. "Approach to learning for assessment in economics," Economic Analysis and Policy, Elsevier, vol. 78(C), pages 571-584.
    10. Nixon Chan & Peter E. Kennedy, 2002. "Are Multiple‐Choice Exams Easier for Economics Students? A Comparison of Multiple‐Choice and “Equivalent” Constructed‐Response Exam Questions," Southern Economic Journal, John Wiley & Sons, vol. 68(4), pages 957-971, April.
    11. Bagues, Manuel & Perez-Villadoniga, Maria J., 2012. "Do recruiters prefer applicants with similar skills? Evidence from a randomized natural experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 82(1), pages 12-20.
    12. Christine Jonick & Jennifer Schneider & Daniel Boylan, 2017. "The effect of accounting question response formats on student performance," Accounting Education, Taylor & Francis Journals, vol. 26(4), pages 291-315, July.
    13. Abhijit Sharma, 2015. "Use of Bloomberg Professional in support of finance and economics teaching," Cogent Economics & Finance, Taylor & Francis Journals, vol. 3(1), Article 1115618, December.
    14. Oskar Harmon & James Lambrinos & Judy Buffolino, 2008. "Is the Cheating Risk Always Higher in Online Instruction Compared to Face-to-Face Instruction?," Working papers 2008-14, University of Connecticut, Department of Economics, revised Sep 2010.
    15. Thompson, Alexi S. & Jager, Abigail L. & Burton, Robert O., Jr., 2012. "Do Men and Women Perform Differently on Different Types of Test Questions?," 2012 Annual Meeting, February 4-7, 2012, Birmingham, Alabama 119771, Southern Agricultural Economics Association.
    16. Yilmaz Guney, 2009. "Exogenous and Endogenous Factors Influencing Students' Performance in Undergraduate Accounting Modules," Accounting Education, Taylor & Francis Journals, vol. 18(1), pages 51-73.
    17. Douglas McKee & Steven Zhu & George Orlov, 2023. "Econ-assessments.org: Automated Assessment of Economics Skills," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 49(1), pages 4-14, January.
    18. Naser Yousef ALzoubi & Asma Shafe Assaf, 2017. "Suitability of Multiple-choice Questions in Evaluating the Objectives of Academic Educational Process of Accounting Specialization," International Business Research, Canadian Center of Science and Education, vol. 10(6), pages 46-61, June.
    19. Martin P. Shanahan & Gigi Foster & Jan H. F. Meyer, 2006. "Operationalising a Threshold Concept in Economics: A Pilot Study Using Multiple Choice Questions on Opportunity Cost," International Review of Economic Education, Economics Network, University of Bristol, vol. 5(2), pages 29-57.
    20. Allgood, Sam & Bayer, Amanda, 2016. "Measuring College Learning in Economics," MPRA Paper 85104, University Library of Munich, Germany.

    More about this item

    Keywords

    Principles of Economics Assessment; Multiple Choice; Constructed Response; Free Response; Essay;

    JEL classification:

    • A22 - General Economics and Teaching - - Economic Education and Teaching of Economics - - - Undergraduate


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cbt:econwp:11/02. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Albert Yee (email available below). General contact details of provider: https://edirc.repec.org/data/decannz.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.