  1. National Taiwan Ocean University Research Hub
  2. College of Electrical Engineering and Computer Science
  3. Department of Electrical Engineering
Please use this Handle URI to cite this item: http://scholars.ntou.edu.tw/handle/123456789/19561
Title: A Video Analytic In-Class Student Concentration Monitoring System
Authors: Su, Mu-Chun
Cheng, Chun-Ting
Chang, Ming-Ching
Hsieh, Yi-Zeng
Keywords: Face recognition;Monitoring;Visual analytics;Pipelines;Magnetic heads;Object detection;Cameras;Video analytics;deep learning;student concentration;behavior cues;face detection;landmark tracking;facial orientation;facial expre
Date of Publication: 1-Nov-2021
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Volume: 67
Issue: 4
Pages: 294-304
Source Publication: IEEE TRANSACTIONS ON CONSUMER ELECTRONICS
Abstract:
Automatic learning feedback monitoring and analysis are becoming essential in modern education. We present a video analytic system capable of monitoring in-class students' learning behaviors and providing feedback to the instructor. It is common practice nowadays for students to take electronic notes or browse online using laptops and cellphones in class. However, the use of technology can also impact student concentration and affect learning behaviors, which can seriously hinder their learning progress if not controlled properly. In this pioneering study, we propose a non-intrusive deep-learning-based computer vision system that monitors student concentration by extracting and inferring high-level visual behavior cues, including facial expressions, gestures, and activities. Our system can automatically assist instructors with situational awareness in real time. We assume only RGB color images as input, and the system is runnable on edge devices for easy deployment. We propose two video analytic components for student behavior analysis: (1) the facial analysis component, based on Dlib face detection and facial landmark tracking, localizes each student and analyzes face orientation, eye blinking, gaze, and facial expression; (2) the activity detection and recognition component, based on OpenPose and COCO object detection, identifies eight types of in-class gestures and behaviors, including raising-hand, typing, phone-answering, crooked-head, and desk napping. Experiments are performed on a newly collected real-world In-Class Student Activity Dataset (ICSAD), where we achieved nearly an 80% activity detection rate. Our system is view-independent in handling facial and pose orientations, with an average angular error below 10 degrees. The source code of this work is at: https://github.com/YiZengHsieh/ICSAD.
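To illustrate one of the cues the abstract's facial analysis component tracks, below is a minimal sketch of eye-blink detection via the eye aspect ratio (EAR), a common heuristic computed over the six per-eye landmarks produced by Dlib's 68-point model. The formula and the 0.2 threshold are illustrative assumptions, not necessarily the paper's exact method.

```python
import math


def eye_aspect_ratio(eye):
    """EAR over 6 eye landmarks (p1..p6, Dlib ordering):
    (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).
    High when the eye is open; drops toward 0 as it closes."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)


def is_blinking(eye, threshold=0.2):
    # The threshold is a hypothetical tuning value, not from the paper.
    return eye_aspect_ratio(eye) < threshold
```

In a full pipeline, the six (x, y) landmark coordinates per eye would come from a Dlib shape predictor run on each detected face; a blink is typically registered when the EAR stays below the threshold for a few consecutive frames.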
URI: http://scholars.ntou.edu.tw/handle/123456789/19561
ISSN: 0098-3063
DOI: 10.1109/TCE.2021.3126877
Appears in Collections: Department of Electrical Engineering


Items in the IR are protected by copyright, with all rights reserved, unless otherwise indicated.
