National Taiwan Ocean University Research Hub
Please use this Handle URI to cite this item: http://scholars.ntou.edu.tw/handle/123456789/7222
DC Field | Value | Language
dc.contributor.author | Huang, R. J. | en_US
dc.contributor.author | Tsao, C. Y. | en_US
dc.contributor.author | Kuo, Y. P. | en_US
dc.contributor.author | Lai, Y. C. | en_US
dc.contributor.author | Liu, C. C. | en_US
dc.contributor.author | Tu, Z. W. | en_US
dc.contributor.author | Jung-Hua Wang | en_US
dc.contributor.author | Chung-Cheng Chang | en_US
dc.date.accessioned | 2020-11-20T07:04:44Z | -
dc.date.available | 2020-11-20T07:04:44Z | -
dc.date.issued | 2018-07-24 | -
dc.identifier.issn | 1424-8220 | -
dc.identifier.uri | http://scholars.ntou.edu.tw/handle/123456789/7222 | -
dc.description.abstract | Recently, an upsurge of deep learning has provided a new direction for the fields of computer vision and visual tracking. However, the expensive offline training time and the large number of images required by deep learning have greatly hindered progress. This paper aims to further improve the computational performance of CNT, which is reported to deliver 5 fps in visual tracking. We propose a method called Fast-CNT that differs from CNT in three aspects: first, an adaptive k value (rather than a constant 100) is determined for an input video; second, the background filters used in CNT are omitted in this work to save computation time without affecting performance; third, SURF feature points are used in conjunction with the particle filter to address the drift problem in CNT. Extensive experimental results on land and undersea video sequences show that Fast-CNT outperforms CNT by 2~10 times in terms of computational efficiency. | en_US
dc.language.iso | en | en_US
dc.publisher | MDPI | en_US
dc.relation.ispartof | Sensors | en_US
dc.subject | visual tracking | en_US
dc.subject | convolutional networks | en_US
dc.subject | clustering | en_US
dc.subject | IoT | en_US
dc.subject | object detection | en_US
dc.title | Fast Visual Tracking Based on Convolutional Networks | en_US
dc.type | journal article | en_US
dc.identifier.doi | 10.3390/s18082405 | -
dc.identifier.url | <Go to ISI>://WOS:000445712400005 | -
dc.relation.journalvolume | 18 | en_US
dc.relation.journalissue | 8 | en_US
item.openairetype | journal article | -
item.fulltext | no fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.grantfulltext | none | -
item.cerifentitytype | Publications | -
item.languageiso639-1 | en | -
crisitem.author.dept | College of Electrical Engineering and Computer Science | -
crisitem.author.dept | Department of Electrical Engineering | -
crisitem.author.dept | National Taiwan Ocean University, NTOU | -
crisitem.author.dept | College of Electrical Engineering and Computer Science | -
crisitem.author.dept | Department of Electrical Engineering | -
crisitem.author.dept | National Taiwan Ocean University, NTOU | -
crisitem.author.dept | Center of Excellence for Ocean Engineering | -
crisitem.author.dept | Data Analysis and Administrative Support | -
crisitem.author.orcid | 0000-0002-8560-6030 | -
crisitem.author.parentorg | National Taiwan Ocean University, NTOU | -
crisitem.author.parentorg | College of Electrical Engineering and Computer Science | -
crisitem.author.parentorg | National Taiwan Ocean University, NTOU | -
crisitem.author.parentorg | College of Electrical Engineering and Computer Science | -
crisitem.author.parentorg | National Taiwan Ocean University, NTOU | -
crisitem.author.parentorg | Center of Excellence for Ocean Engineering | -
Appears in Collections: Department of Electrical Engineering
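The drift-handling idea mentioned in the abstract above, combining SURF feature points with a particle filter, can be illustrated with a minimal sketch. This is not the authors' Fast-CNT implementation; it only shows, under stated assumptions, how keypoint matches between the target template and the current frame might re-anchor particle positions before diffusion. ORB is used here as a stand-in for SURF (SURF requires an OpenCV build with the nonfree contrib modules), and all function and parameter names are illustrative.

    import cv2
    import numpy as np

    # ORB as a stand-in for SURF (SURF needs an OpenCV nonfree/contrib build).
    detector = cv2.ORB_create(nfeatures=500)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def keypoint_offset(template, frame):
        """Median (dx, dy) displacement of matched keypoints, or None if matching fails."""
        kp1, des1 = detector.detectAndCompute(template, None)
        kp2, des2 = detector.detectAndCompute(frame, None)
        if des1 is None or des2 is None:
            return None
        matches = matcher.match(des1, des2)
        if len(matches) < 8:
            return None
        shifts = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in matches]
        return np.median(np.array(shifts), axis=0)

    def propagate_particles(particles, offset, noise_std=5.0):
        """Shift particle (x, y) states by the matched motion, then add Gaussian diffusion."""
        if offset is not None:
            particles = particles + offset  # anchor particles to the observed keypoint motion
        return particles + np.random.normal(0.0, noise_std, particles.shape)

In a tracking loop, keypoint_offset would be computed from grayscale crops of the template and the current frame, and propagate_particles would replace a purely random particle-motion model, which is one way the drift described in the abstract can be reduced.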

Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
