
Evaluation of Online Teaching Quality Based on Facial Expression Recognition

Author

Listed:
  • Changbo Hou

    (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)

  • Jiajun Ai

    (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)

  • Yun Lin

    (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)

  • Chenyang Guan

    (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)

  • Jiawen Li

    (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)

  • Wenyu Zhu

    (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)

Abstract

In 21st-century society, information technology has developed rapidly, the scientific and technological capacity of every industry is increasing, and the field of education has also gradually begun to adopt new technologies. Affected by the epidemic, online teaching has been implemented across the country, forming an education model of “dual integration” of online and offline teaching. However, the disadvantage of online teaching is also very obvious: teachers cannot observe the students’ listening status in real time. Therefore, our study adopts automatic face detection and expression recognition based on a deep learning framework, together with other related technologies, to solve this problem, and it designs a system for analyzing students’ in-class concentration based on expression recognition. This system can help teachers monitor students’ concentration in class and improve the efficiency of class evaluation. In this system, OpenCV is used to call the camera and capture the students’ listening status in real time, and the MTCNN algorithm is used to detect faces in the video and frame the locations of the students’ face images. Finally, the obtained face images are fed to a VGG16 network augmented with ECANet for real-time expression recognition, yielding the students’ in-class emotions. The experimental results show that the proposed method can identify students’ in-class emotions more accurately and support teaching-effect evaluation, which has application value in intelligent education settings such as the smart classroom and distance learning. For example, a teaching evaluation module can be added to teaching software so that teachers can see each student’s listening emotions while lecturing.
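The abstract outlines a three-stage pipeline: OpenCV captures webcam frames, MTCNN locates student faces, and a VGG16 network with an ECANet (Efficient Channel Attention) block classifies each face's expression. The sketch below shows one way such a pipeline could be wired together; it is not the authors' implementation. The facenet-pytorch MTCNN detector, the seven-class emotion label set, and the placement of the ECA block after the VGG16 convolutional features are all assumptions made for illustration.

```python
# Minimal sketch of the pipeline described in the abstract (assumptions noted inline).
import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms
from facenet_pytorch import MTCNN   # assumed MTCNN implementation, not necessarily the paper's

# Assumed seven-class label set; the paper's categories may differ.
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

class ECA(nn.Module):
    """Efficient Channel Attention: global average pooling + 1-D conv + sigmoid gate."""
    def __init__(self, kernel_size: int = 3):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        # x: (B, C, H, W) -> per-channel weights (B, C, 1, 1)
        w = self.pool(x).squeeze(-1).transpose(-1, -2)            # (B, 1, C)
        w = torch.sigmoid(self.conv(w)).transpose(-1, -2).unsqueeze(-1)
        return x * w

class EmotionVGG(nn.Module):
    """VGG16 backbone with an ECA block after the conv features (placement assumed)."""
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        vgg = models.vgg16(weights=None)   # in practice, load weights fine-tuned on an expression dataset
        self.features = nn.Sequential(vgg.features, ECA())
        self.pool = vgg.avgpool
        self.classifier = vgg.classifier
        self.classifier[-1] = nn.Linear(4096, num_classes)

    def forward(self, x):
        x = self.pool(self.features(x))
        return self.classifier(torch.flatten(x, 1))

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

detector = MTCNN(keep_all=True)     # detect every face in the frame
model = EmotionVGG().eval()

cap = cv2.VideoCapture(0)           # webcam capture, as in the abstract
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    boxes, _ = detector.detect(rgb)            # face bounding boxes, or None
    if boxes is not None:
        for x1, y1, x2, y2 in boxes.astype(int):
            face = rgb[max(y1, 0):y2, max(x1, 0):x2]
            if face.size == 0:
                continue
            with torch.no_grad():
                logits = model(preprocess(face).unsqueeze(0))
            label = EMOTIONS[int(logits.argmax(dim=1))]
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
            cv2.putText(frame, label, (x1, y1 - 8),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("class-concentration demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a classroom-monitoring setting, the per-frame labels would typically be aggregated over time into a concentration indicator per student, which is the kind of teaching-effect signal the abstract describes; the aggregation scheme is not specified here.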

Suggested Citation

  • Changbo Hou & Jiajun Ai & Yun Lin & Chenyang Guan & Jiawen Li & Wenyu Zhu, 2022. "Evaluation of Online Teaching Quality Based on Facial Expression Recognition," Future Internet, MDPI, vol. 14(6), pages 1-12, June.
  • Handle: RePEc:gam:jftint:v:14:y:2022:i:6:p:177-:d:834462

    Download full text from publisher

    File URL: https://www.mdpi.com/1999-5903/14/6/177/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1999-5903/14/6/177/
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Fan Liu & Jiandong Fang, 2023. "Multi-Scale Audio Spectrogram Transformer for Classroom Teaching Interaction Recognition," Future Internet, MDPI, vol. 15(2), pages 1-19, February.
    2. Hongtao Zhu & Huahu Xu & Xiaojin Ma & Minjie Bian, 2022. "Facial Expression Recognition Using Dual Path Feature Fusion and Stacked Attention," Future Internet, MDPI, vol. 14(9), pages 1-17, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jftint:v:14:y:2022:i:6:p:177-:d:834462. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.