Printed from https://ideas.repec.org/a/bla/ausecp/v46y2007i1p18-38.html

Systematic Influences On Teaching Evaluations: The Case For Caution

Author

Listed:
  • Martin Davies
  • Joe Hirschberg
  • Jenny Lye
  • Carol Johnston
  • Ian McDonald

Abstract

In this paper, we examine eight years of Quality of Teaching (QOT) responses from an Economics Department in an Australian University. This is done to determine what factors, besides the instructor, have an impact on the raw average student evaluation scores. Most of the previous research on student ratings has been conducted in the US. One significant difference between US and Australian tertiary education is that, on average, the number of foreign undergraduate students in Australia is ten times the number in US institutions. We find that cultural background significantly affects student evaluations. Other factors that have an influence on the average QOT score include: year level; enrolment size; the quantitative nature of the subject; the gender of the student; fee‐paying status by gender; course of study; the differences between the course mark and previous marks; the quality of workbooks; the quality of textbooks; and the QOT score relative to those in other subjects taught at the same time. In addition, average QOT scores for instructors who have taught in a mix of subjects are similar to those based on scores adjusted to account for subject and student characteristics.

Suggested Citation

  • Martin Davies & Joe Hirschberg & Jenny Lye & Carol Johnston & Ian McDonald, 2007. "Systematic Influences On Teaching Evaluations: The Case For Caution," Australian Economic Papers, Wiley Blackwell, vol. 46(1), pages 18-38, March.
  • Handle: RePEc:bla:ausecp:v:46:y:2007:i:1:p:18-38
    DOI: 10.1111/j.1467-8454.2007.00303.x

    Download full text from publisher

    File URL: https://doi.org/10.1111/j.1467-8454.2007.00303.x
    Download Restriction: no


    References listed on IDEAS

    1. White, Halbert, 1980. "A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity," Econometrica, Econometric Society, vol. 48(4), pages 817-838, May.
    2. Omer Gokcekus, 2000. "How do university students value economics courses? A hedonic approach," Applied Economics Letters, Taylor & Francis Journals, vol. 7(8), pages 493-496.
    3. Mason, Paul M. & Steagall, Jeffrey W. & Fabritius, Michael M., 1995. "Student evaluations of faculty: A new procedure for using aggregate measures of performance," Economics of Education Review, Elsevier, vol. 14(4), pages 403-416, December.
    4. L. F. Jameson Boex, 2000. "Attributes of Effective Economics Instructors: An Analysis of Student Evaluations," The Journal of Economic Education, Taylor & Francis Journals, vol. 31(3), pages 211-227, September.
    5. William Bosshardt & Michael Watts, 2001. "Comparing Student and Instructor Evaluations of Teaching," The Journal of Economic Education, Taylor & Francis Journals, vol. 32(1), pages 3-17, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Silvia Ferrini & Marco P. Tucci, 2011. "Evaluating Research Activity: Impact Factor vs. Research Factor," Department of Economics University of Siena 614, Department of Economics, University of Siena.
    2. Bredtmann, Julia & Crede, Carsten J. & Otten, Sebastian, 2013. "Methods for evaluating educational programs: Does Writing Center Participation affect student achievement?," Evaluation and Program Planning, Elsevier, vol. 36(1), pages 115-123.
    3. De Witte, K. & Rogge, N., 2009. "Accounting for exogenous influences in a benevolent performance evaluation of teachers," Working Papers 15, Top Institute for Evidence Based Education Research.
    4. Amalia Vanacore & Maria Sole Pellegrino, 2019. "How Reliable are Students’ Evaluations of Teaching (SETs)? A Study to Test Student’s Reproducibility and Repeatability," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 146(1), pages 77-89, November.
    5. Cho, Donghun & Baek, Wonyoung & Cho, Joonmo, 2015. "Why do good performing students highly rate their instructors? Evidence from a natural experiment," Economics of Education Review, Elsevier, vol. 49(C), pages 172-179.
    6. Donghun Cho & Joonmo Cho, 2017. "Does More Accurate Knowledge of Course Grade Impact Teaching Evaluation?," Education Finance and Policy, MIT Press, vol. 12(2), pages 224-240, Spring.
    7. Benjamin Artz & David M. Welsch, 2013. "The Effect of Student Evaluations on Academic Success," Education Finance and Policy, MIT Press, vol. 8(1), pages 100-119, January.
    8. Joe Hirschberg & Jenny Lye, 2014. "The influence of student experiences on post-graduation surveys," Department of Economics - Working Papers Series 1187, The University of Melbourne.
    9. Timothy A. Bodisco & Stuart Palmer, 2020. "Presentation and Evaluation of a New Graduate Unit of Study in Engineering Product Development," Sustainability, MDPI, Open Access Journal, vol. 12(14), pages 1-14, July.
    10. Berezvai, Zombor & Lukáts, Gergely Dániel & Molontay, Roland, 2019. "A pénzügyi ösztönzők hatása az egyetemi oktatók osztályozási gyakorlatára [How financially rewarding student evaluation may affect grading behaviour. Evidence from a natural experiment]," Közgazdasági Szemle (Economic Review - monthly of the Hungarian Academy of Sciences), Közgazdasági Szemle Alapítvány (Economic Review Foundation), vol. 0(7), pages 733-750.
    11. Wagner, N. & Rieger, M. & Voorvelt, K.J., 2016. "Gender, ethnicity and teaching evaluations : Evidence from mixed teaching teams," ISS Working Papers - General Series 617, International Institute of Social Studies of Erasmus University Rotterdam (ISS), The Hague.
    12. Wagner, Natascha & Rieger, Matthias & Voorvelt, Katherine, 2016. "Gender, ethnicity and teaching evaluations: Evidence from mixed teaching teams," Economics of Education Review, Elsevier, vol. 54(C), pages 79-94.
    13. Joe Hirschberg & Jenny Lye & Martin Davies & Carol Johnston, 2011. "Measuring Student Experience: Relationships between Teaching Quality Instruments (TQI) and Course Experience Questionnaire (CEQ)," Department of Economics - Working Papers Series 1134, The University of Melbourne.
    14. Joonmo Cho & Wonyoung Baek, 2019. "Identifying Factors Affecting the Quality of Teaching in Basic Science Education: Physics, Biological Sciences, Mathematics, and Chemistry," Sustainability, MDPI, Open Access Journal, vol. 11(14), pages 1-18, July.
    15. Yoav Gal & Adiv Gal, 2014. "Knowledge Bias: Is There a Link Between Students’ Feedback and the Grades They Expect to Get from the Lecturers They Have Evaluated? A Case Study of Israeli Colleges," Journal of the Knowledge Economy, Springer;Portland International Center for Management of Engineering and Technology (PICMET), vol. 5(3), pages 597-615, September.
    16. De Witte, Kristof & Rogge, Nicky, 2011. "Accounting for exogenous influences in performance evaluations of teachers," Economics of Education Review, Elsevier, vol. 30(4), pages 641-653, August.
    17. Yoav Gal & Adiv Gal, 2016. "Knowledge Bias by Utilizing the Wording on Feedback Questionnaires: A Case Study of an Israeli College," Journal of the Knowledge Economy, Springer;Portland International Center for Management of Engineering and Technology (PICMET), vol. 7(3), pages 753-770, September.
    18. Marco P. Tucci & Sandra Fontani & Silvia Ferrini, 2008. "L' "R-Factor": un nuovo modo di valutare la ricerca scientifica [The "R-Factor": a new way of evaluating scientific research]," Department of Economics University of Siena 527, Department of Economics, University of Siena.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bruce A. Weinberg & Belton M. Fleisher & Masanori Hashimoto, 2007. "Evaluating Methods for Evaluating Instruction: The Case of Higher Education," NBER Working Papers 12844, National Bureau of Economic Research, Inc.
    2. Mohammad Alauddin & Temesgen Kifle, 2014. "Does the student evaluation of teaching instrument really measure instructors' teaching effectiveness? An econometric analysis of students' perceptions in economics courses," Economic Analysis and Policy, Elsevier, vol. 44(2), pages 156-168.
    3. Joe Hirschberg & Jenny Lye & Martin Davies & Carol Johnston, 2011. "Measuring Student Experience: Relationships between Teaching Quality Instruments (TQI) and Course Experience Questionnaire (CEQ)," Department of Economics - Working Papers Series 1134, The University of Melbourne.
    4. S. Arunachalam & Sridhar N. Ramaswami & Pol Herrmann & Doug Walker, 2018. "Innovation pathway to profitability: the role of entrepreneurial orientation and marketing capabilities," Journal of the Academy of Marketing Science, Springer, vol. 46(4), pages 744-766, July.
    5. Jang, Heesun & Du, Xiaodong, 2013. "Price- and Policy-Induced Innovations: The Case of U.S. Biofuel," Journal of Agricultural and Resource Economics, Western Agricultural Economics Association, vol. 38(3), pages 1-13.
    6. Timothy Erickson & Toni M. Whited, 2000. "Measurement Error and the Relationship between Investment and q," Journal of Political Economy, University of Chicago Press, vol. 108(5), pages 1027-1057, October.
    7. Paul W. Miller & Barry R. Chiswick, 2002. "Immigrant earnings: Language skills, linguistic concentrations and the business cycle," Journal of Population Economics, Springer;European Society for Population Economics, vol. 15(1), pages 31-57.
    8. Michael K Andersson & Ted Aranki & André Reslow, 2017. "Adjusting for information content when comparing forecast performance," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 36(7), pages 784-794, November.
    9. Fors, Gunnar & Zejan, Mario, 1996. "Overseas R&D by Multinationals in foreign Centers of Excellence," SSE/EFI Working Paper Series in Economics and Finance 111, Stockholm School of Economics.
    10. Küçük, Ugur N., 2009. "Emerging Market Local Currency Bond Market, Too Risky to Invest?," MPRA Paper 21878, University Library of Munich, Germany.
    11. Akbar Ullah & Ejaz Ghani & Attiya Y. Javed, 2013. "Market Power and Industrial Performance in Pakistan," PIDE-Working Papers 2013:88, Pakistan Institute of Development Economics.
    12. Rodrigo M. S. Moita & Claudio Paiva, 2013. "Political Price Cycles in Regulated Industries: Theory and Evidence," American Economic Journal: Economic Policy, American Economic Association, vol. 5(1), pages 94-121, February.
    13. Lauren Bin Dong, 2004. "Testing for structural Change in Regression: An Empirical Likelihood Ratio Approach," Econometrics Working Papers 0405, Department of Economics, University of Victoria.
    14. Butler, Marty & Leone, Andrew J. & Willenborg, Michael, 2004. "An empirical analysis of auditor reporting and its association with abnormal accruals," Journal of Accounting and Economics, Elsevier, vol. 37(2), pages 139-165, June.
    15. Bertrand, Philippe & Lapointe, Vincent, 2015. "How performance of risk-based strategies is modified by socially responsible investment universe?," International Review of Financial Analysis, Elsevier, vol. 38(C), pages 175-190.
    16. C, Loran & Eckbo, Espen & Lu, Ching-Chih, 2014. "Does Executive Compensation Reflect Default Risk?," UiS Working Papers in Economics and Finance 2014/11, University of Stavanger.
    17. Baiyegunhi, L.J.S. & Oppong, B.B., 2016. "Commercialisation of mopane worm (Imbrasia belina) in rural households in Limpopo Province, South Africa," Forest Policy and Economics, Elsevier, vol. 62(C), pages 141-148.
    18. Valerie Cerra & Sweta Chaman Saxena, 2008. "Growth Dynamics: The Myth of Economic Recovery," American Economic Review, American Economic Association, vol. 98(1), pages 439-457, March.
    19. Guhan Subramanian, "undated". "Post-Siliconix Freeze-Outs: Theory, Evidence & Policy," American Law & Economics Association Annual Meetings 1016, American Law & Economics Association.
    20. MacKinnon, J G, 1989. "Heteroskedasticity-Robust Tests for Structural Change," Empirical Economics, Springer, vol. 14(2), pages 77-92.



    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.