Please use this Handle URI to cite this item: http://scholars.ntou.edu.tw/handle/123456789/16966
DC Field | Value | Language
dc.contributor.author | Jung-Hua Wang | en_US
dc.contributor.author | Rau, JD | en_US
dc.contributor.author | Peng, CY | en_US
dc.date.accessioned | 2021-06-03T07:11:18Z | -
dc.date.available | 2021-06-03T07:11:18Z | -
dc.date.issued | 2000-08 | -
dc.identifier.issn | 1083-4419 | -
dc.identifier.uri | http://scholars.ntou.edu.tw/handle/123456789/16966 | -
dc.description.abstract | This paper optimizes the performance of the GCS model [1] in learning topology and vector quantization. Each node in GCS is attached with a resource counter. During the competitive learning process, the counter of the best-matching node is increased by a defined resource measure after each input presentation, and then all resource counters are decayed by a factor α. We show that the summation of all resource counters is conserved. This conservation principle provides useful clues for exploring important characteristics of GCS, which in turn provide insight into how the GCS can be optimized. In the context of information entropy, we show that the performance of GCS in learning topology and vector quantization can be optimized by using α = 0 incorporated with a threshold-free node-removal scheme, regardless of whether the input data are stationary or nonstationary. The meaning of optimization is twofold: 1) for learning topology, the information entropy is maximized in terms of the equiprobable criterion, and 2) for learning vector quantization, the MSE is minimized in terms of the equi-error criterion. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | en_US
dc.relation.ispartof | IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS | en_US
dc.subject | competitive learning | en_US
dc.subject | growing cell structures | en_US
dc.subject | neural networks | en_US
dc.subject | quantization | en_US
dc.subject | self-creating networks | en_US
dc.subject | short-term memory topology | en_US
dc.title | Toward optimizing a self-creating neural network | en_US
dc.type | journal article | en_US
dc.identifier.doi | 10.1109/3477.865177 | -
dc.identifier.pmid | 18252390 | -
dc.identifier.isi | WOS:000089118000010 | -
dc.relation.journalvolume | 30 | en_US
dc.relation.journalissue | 4 | en_US
dc.relation.pages | 586-593 | en_US
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.cerifentitytype | Publications | -
item.languageiso639-1 | en | -
item.fulltext | no fulltext | -
item.grantfulltext | none | -
item.openairetype | journal article | -
crisitem.author.dept | College of Electrical Engineering and Computer Science | -
crisitem.author.dept | Department of Electrical Engineering | -
crisitem.author.dept | National Taiwan Ocean University,NTOU | -
crisitem.author.parentorg | National Taiwan Ocean University,NTOU | -
crisitem.author.parentorg | College of Electrical Engineering and Computer Science | -
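
The abstract above describes the core bookkeeping of the GCS model: after each input presentation, the resource counter of the best-matching node is increased by a resource measure and then all counters are decayed by a factor α. The following is a minimal Python sketch of that update only, assuming a squared quantization error as the resource measure and a fixed node set (the paper's node-growth and threshold-free node-removal steps are omitted); the function and variable names are illustrative, not taken from the paper.

import numpy as np

def gcs_counter_update(nodes, counters, x, alpha, lr=0.05):
    """One competitive-learning step with resource counters.

    Assumptions (not from the paper): the resource measure is the squared
    quantization error, and only the winner's weight vector is adapted.
    """
    # Find the best-matching (winning) node for input x.
    dists = np.sum((nodes - x) ** 2, axis=1)
    winner = int(np.argmin(dists))

    # Increase the winner's resource counter by the resource measure.
    counters[winner] += dists[winner]

    # Decay all resource counters by the factor alpha
    # (alpha = 0 leaves the counters undecayed).
    counters *= (1.0 - alpha)

    # Move the winner toward the input (basic competitive learning).
    nodes[winner] += lr * (x - nodes[winner])
    return winner

# Usage: track the counter sum over a stream of 2-D inputs.
rng = np.random.default_rng(0)
nodes = rng.normal(size=(8, 2))      # 8 reference vectors in 2-D
counters = np.zeros(len(nodes))      # one resource counter per node
for _ in range(1000):
    gcs_counter_update(nodes, counters, rng.normal(size=2), alpha=0.05)
print("counter sum:", counters.sum())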
Appears in Collections: Department of Electrical Engineering

Web of Science™ Citations: 2 (last week: 0, last month: 0), checked on 2023/6/27
Page view(s): 80 (last week: 0, last month: 1), checked on 2025/6/30

Items in this IR system are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.

DSpace-CRIS Software Copyright © 2002- DuraSpace, 4Science. Extension maintained and optimized by NTU Library.