Title: A Video Analytic In-Class Student Concentration Monitoring System
Authors: Su, Mu-Chun; Cheng, Chun-Ting; Chang, Ming-Ching; Hsieh, Yi-Zeng
Keywords: Face recognition; Monitoring; Visual analytics; Pipelines; Magnetic heads; Object detection; Cameras; Video analytics; deep learning; student concentration; behavior cues; face detection; landmark tracking; facial orientation; facial expression
Issue Date: 1-Nov-2021
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Volume: 67
Issue: 4
Pages: 294-304
Source Publication: IEEE TRANSACTIONS ON CONSUMER ELECTRONICS
Abstract: Automatic learning feedback monitoring and analysis are becoming essential in modern education. We present a video analytic system capable of monitoring in-class students' learning behaviors and providing feedback to the instructor. It is a common practice nowadays for students to take electronic notes or browse online using laptops and cellphones in class. However, the use of technology can also impact student concentration and affect learning behaviors, which can seriously hinder their learning progress if not controlled properly. In this pioneering study, we propose a non-intrusive, deep-learning-based computer vision system to monitor student concentration by extracting and inferring high-level visual behavior cues, including facial expressions, gestures, and activities. Our system can automatically assist instructors with situational awareness in real time. The system assumes only RGB color images as input and is runnable on edge devices for easy deployment. We propose two video analytic components for student behavior analysis: (1) The facial analysis component is based on Dlib face detection and facial landmark tracking, which localize each student and analyze their face orientations, eye blinking, gazes, and facial expressions. (2) The activity detection and recognition component, based on OpenPose and COCO object detection, can identify eight types of in-class gestures and behaviors, including raising-hand, typing, phone-answering, crooked-head, and desk napping. Experiments are performed on a newly collected real-world In-Class Student Activity Dataset (ICSAD), where we achieved a nearly 80% activity detection rate. Our system is view-independent in handling facial and pose orientations, with an average angular error below 10 degrees. The source code of this work is at: https://github.com/YiZengHsieh/ICSAD.
URI: http://scholars.ntou.edu.tw/handle/123456789/19561
ISSN: 0098-3063
DOI: 10.1109/TCE.2021.3126877
Appears in Collections: Department of Electrical Engineering
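To make the abstract's facial analysis pipeline more concrete, the following is a minimal sketch (not the authors' released implementation) of per-frame processing of the kind described: Dlib face detection, 68-point landmark tracking, a blink cue via the eye aspect ratio (EAR), and head-pose estimation with OpenCV's solvePnP. The model file path, the 3D reference points, the focal-length guess, and the EAR threshold are all illustrative assumptions.

```python
# Sketch only: Dlib face + landmark analysis with blink and head-pose cues.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Assumes the standard 68-landmark model file is available locally.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Approximate 3D reference points (nose tip, chin, eye corners, mouth corners)
# commonly paired with landmarks 30, 8, 36, 45, 48, 54 for head-pose estimation.
MODEL_3D = np.array([
    (0.0, 0.0, 0.0), (0.0, -330.0, -65.0),
    (-225.0, 170.0, -135.0), (225.0, 170.0, -135.0),
    (-150.0, -150.0, -125.0), (150.0, -150.0, -125.0),
], dtype=np.float64)

def eye_aspect_ratio(pts):
    """EAR = (|p2-p6| + |p3-p5|) / (2|p1-p4|); a small value suggests a closed eye."""
    a = np.linalg.norm(pts[1] - pts[5])
    b = np.linalg.norm(pts[2] - pts[4])
    c = np.linalg.norm(pts[0] - pts[3])
    return (a + b) / (2.0 * c)

def analyze_frame(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for rect in detector(gray, 1):
        shape = predictor(gray, rect)
        pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)],
                       dtype=np.float64)

        # Blink cue: average EAR over both eyes (landmarks 36-41 and 42-47).
        ear = (eye_aspect_ratio(pts[36:42]) + eye_aspect_ratio(pts[42:48])) / 2.0

        # Head-pose cue: rotation from six 2D-3D correspondences.
        image_2d = pts[[30, 8, 36, 45, 48, 54]]
        h, w = frame.shape[:2]
        focal = float(w)  # rough focal-length assumption
        cam = np.array([[focal, 0, w / 2], [0, focal, h / 2], [0, 0, 1]],
                       dtype=np.float64)
        ok, rvec, tvec = cv2.solvePnP(MODEL_3D, image_2d, cam, np.zeros(4))
        results.append({"bbox": rect, "ear": ear,
                        "blinking": ear < 0.2,  # assumed threshold
                        "rvec": rvec if ok else None})
    return results
```

In practice the rotation vector `rvec` would be converted to yaw/pitch/roll angles to reason about face orientation, and the per-frame cues would be aggregated over time rather than judged from a single frame.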
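The paper's activity component recognizes eight gesture/behavior classes from OpenPose and COCO object detection. As a hypothetical illustration only (not the paper's trained detector), the sketch below shows how simple geometric cues such as raising-hand and crooked-head could be read off 2D keypoints in the COCO-18 layout that OpenPose outputs (index 0 = nose, 1 = neck, 4 = right wrist, 7 = left wrist); the confidence threshold and rules are assumptions.

```python
# Hypothetical rule-of-thumb gesture cues from OpenPose COCO-18 keypoints.
import numpy as np

NOSE, NECK, R_WRIST, L_WRIST = 0, 1, 4, 7

def gesture_cues(keypoints, conf_thresh=0.3):
    """keypoints: (18, 3) array of (x, y, confidence) for one person.
    Image y grows downward, so a smaller y means higher in the frame."""
    def valid(i):
        return keypoints[i, 2] > conf_thresh

    cues = []
    # Raising-hand cue: either wrist lifted above head (nose) level.
    if valid(NOSE) and (
        (valid(R_WRIST) and keypoints[R_WRIST, 1] < keypoints[NOSE, 1]) or
        (valid(L_WRIST) and keypoints[L_WRIST, 1] < keypoints[NOSE, 1])
    ):
        cues.append("raising-hand")

    # Crooked-head cue: nose displaced far sideways from the neck,
    # relative to the vertical nose-neck distance.
    if valid(NOSE) and valid(NECK):
        dx = abs(keypoints[NOSE, 0] - keypoints[NECK, 0])
        dy = abs(keypoints[NOSE, 1] - keypoints[NECK, 1]) + 1e-6
        if dx / dy > 0.7:
            cues.append("crooked-head")
    return cues

# Synthetic example: right wrist above the nose -> raising-hand.
kp = np.zeros((18, 3))
kp[NOSE] = (100, 80, 0.9)
kp[NECK] = (100, 120, 0.9)
kp[R_WRIST] = (130, 50, 0.9)
print(gesture_cues(kp))  # ['raising-hand']
```

Object-centric behaviors such as typing or phone-answering would additionally need detected objects (laptop, cellphone) near the hands, which is where the COCO object detector mentioned in the abstract comes in.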