National Taiwan Ocean University Research Hub › College of Electrical Engineering and Computer Science › Department of Communications, Navigation and Control Engineering
Please use this Handle URI to cite this item: http://scholars.ntou.edu.tw/handle/123456789/22351
DC metadata (field: value (language)):

dc.contributor.author: Chung, Yao-Liang (en_US)
dc.contributor.author: Chung, Hung-Yuan (en_US)
dc.contributor.author: Tsai, Wei-Feng (en_US)
dc.date.accessioned: 2022-10-04T06:12:32Z
dc.date.available: 2022-10-04T06:12:32Z
dc.date.issued: 2020-01-01
dc.identifier.issn: 1064-1246
dc.identifier.uri: http://scholars.ntou.edu.tw/handle/123456789/22351
dc.description.abstract: In the present study, we sought to enable instant tracking of the hand region as a region of interest (ROI) within the image range of a webcam, while also identifying specific hand gestures to facilitate the control of home appliances in smart homes or the issuing of commands in human-computer interaction applications. To accomplish this objective, we first applied skin color detection and noise processing to remove unnecessary background information from the captured image, before applying background subtraction to detect the ROI. Then, to prevent background objects or noise from influencing the ROI, we utilized the kernelized correlation filters (KCF) algorithm to track the detected ROI. Next, the ROI image was resized to 100×120 and input into a deep convolutional neural network (CNN) to identify various hand gestures. Two deep CNN architectures, modified from the AlexNet and VGGNet CNNs respectively, were developed by substantially reducing the number of network parameters and appropriately adjusting the internal network configuration. The tracking and recognition process described above was then repeated continuously to achieve real-time operation, with the system running until the hand leaves the camera range. The results indicated excellent performance by both of the proposed deep CNN architectures. In particular, the modified version of the VGGNet CNN achieved the better performance, with a recognition rate of 99.90% on the training data set and 95.61% on the test data set, indicating the system's feasibility for practical applications. (en_US)
dc.language.iso: English (en_US)
dc.publisher: IOS PRESS (en_US)
dc.relation.ispartof: JOURNAL OF INTELLIGENT & FUZZY SYSTEMS (en_US)
dc.subject: Deep CNN (en_US)
dc.subject: gesture recognition (en_US)
dc.subject: VGGNet (en_US)
dc.subject: AlexNet (en_US)
dc.title: Hand gesture recognition via image processing techniques and deep CNN (en_US)
dc.type: journal article (en_US)
dc.identifier.doi: 10.3233/JIFS-200385
dc.identifier.isi: WOS:000582721200092
dc.relation.journalvolume: 39 (en_US)
dc.relation.journalissue: 3 (en_US)
dc.relation.pages: 4405-4418 (en_US)
dc.identifier.eissn: 1875-8967
item.openairecristype: http://purl.org/coar/resource_type/c_6501
item.cerifentitytype: Publications
item.languageiso639-1: English
item.fulltext: no fulltext
item.grantfulltext: none
item.openairetype: journal article
crisitem.author.dept: College of Electrical Engineering and Computer Science
crisitem.author.dept: Department of Communications, Navigation and Control Engineering
crisitem.author.dept: National Taiwan Ocean University, NTOU
crisitem.author.orcid: 0000-0001-6512-1127
crisitem.author.parentorg: National Taiwan Ocean University, NTOU
crisitem.author.parentorg: College of Electrical Engineering and Computer Science
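The pipeline described in the abstract (skin-color detection → ROI extraction → resize to the CNN input size) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the RGB skin rule is a classic published heuristic rather than the paper's exact thresholds, the KCF tracking and CNN stages are omitted, and the 100×120 dimension order (height × width) is assumed.

```python
# Illustrative sketch of the abstract's preprocessing pipeline (assumed, not
# the authors' code): skin-color ROI detection and nearest-neighbor resize.

def is_skin(r, g, b):
    """Classic RGB skin-color heuristic (a common published rule; the
    paper's actual thresholds may differ)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_roi(image):
    """Return the bounding box (top, left, bottom, right) of skin-colored
    pixels, or None if none are found. `image` is a nested list of
    (r, g, b) tuples."""
    rows = [y for y, row in enumerate(image) if any(is_skin(*px) for px in row)]
    cols = [x for row in image for x, px in enumerate(row) if is_skin(*px)]
    if not rows:
        return None
    return min(rows), min(cols), max(rows) + 1, max(cols) + 1

def resize_nearest(image, out_h=100, out_w=120):
    """Nearest-neighbor resize to the CNN input size cited in the abstract
    (100x120; height-by-width order assumed)."""
    in_h, in_w = len(image), len(image[0])
    return [[image[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)] for y in range(out_h)]

# Tiny synthetic frame: a skin-colored patch on a dark background.
SKIN, BG = (200, 120, 80), (10, 10, 10)
frame = [[SKIN if 1 <= y <= 2 and 1 <= x <= 3 else BG for x in range(6)]
         for y in range(4)]
top, left, bottom, right = skin_roi(frame)
roi = [row[left:right] for row in frame[top:bottom]]
patch = resize_nearest(roi)  # 100x120 ROI, ready for the CNN stage
print((top, left, bottom, right), len(patch), len(patch[0]))
```

In the full system described by the abstract, the detected box would seed a KCF tracker on subsequent frames, and each resized patch would be classified by the reduced-parameter AlexNet- or VGGNet-style network.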
Appears in Collections: Department of Communications, Navigation and Control Engineering
Except where copyright terms are otherwise specified, all items in the IR system are protected by copyright, with all rights reserved.