Title: | Scale Shrinking Transformation and Applications | Authors: | Yu-Chen Chen; Keng-Hsuan Wu; Jyun-Ting Lai; Jung-Hua Wang |
Keywords: | neural networks; Shrinking Transformation; Function Approximation | Issue Date: | 20-Sep-2006 | Conference: | 3rd International Conference on Soft Computing and Intelligent Systems and 7th International Symposium on Advanced Intelligent Systems (SCIS & ISIS 2006), Tokyo, Japan |
Abstract: | This paper presents a novel information processing technique called scale shrinking transformation (SST). SST comprises three steps: initialization, matrix transformation, and use of the column vectors of the transformed matrix as the new input vectors. The essence of SST is that the structural correlation between the original inputs can be captured. More significantly, the transformed matrix contains elements with much smaller scale variation. When applied to existing feedforward neural networks, SST alleviates problems commonly encountered in function approximation, separation of nonlinearly separable classes, and noise filtering. When the column vectors are used as the new input to a feedforward network with hidden layers, training time can be reduced. The input scale divergence problem that plagues higher-order neural networks can also be alleviated with SST. |
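The abstract names the three SST steps but not the exact transform, so the following is only a minimal sketch of one plausible reading: the inputs are stacked as matrix columns, a correlation-style product captures structure between inputs, and a column-wise rescaling shrinks the scale variation. The function name `scale_shrinking_transform`, the centering step, the Gram-matrix product, and the norm-based rescaling are all assumptions for illustration, not the authors' published method.

```python
import numpy as np

def scale_shrinking_transform(X):
    """Hypothetical SST sketch (assumed details, not the paper's exact algorithm).

    X : (d, n) array whose n columns are the original input vectors.
    Returns an (n, n) matrix whose columns serve as the new input vectors.
    """
    # Step 1 (initialization): center each feature so the transform is
    # driven by correlation structure rather than raw offsets (assumption).
    Xc = X - X.mean(axis=1, keepdims=True)

    # Step 2 (matrix transformation): a Gram-style product capturing the
    # structural correlation between the original input vectors (assumption),
    # followed by column-wise rescaling so element magnitudes vary far less.
    G = Xc.T @ Xc                                        # (n, n) pairwise structure
    scale = np.linalg.norm(G, axis=0, keepdims=True) + 1e-12
    T = G / scale                                        # shrink scale variation

    # Step 3: the columns of T become the new inputs to the network.
    return T

# Usage: inputs whose features span several orders of magnitude are mapped
# to columns with much smaller scale variation before network training.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8)) * np.array([[1e-3], [1.0], [1e3], [1.0], [10.0]])
Z = scale_shrinking_transform(X)
print(X.std(), Z.std())  # the transformed columns show far smaller spread
```

Under these assumptions, each transformed column has unit Euclidean norm, which is one simple way to realize the abstract's claim that the transformed matrix "contains elements with much smaller scale variation."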
URI: | https://www.jstage.jst.go.jp/article/softscis/2006/0/2006_0_698/_article/-char/ja/ http://scholars.ntou.edu.tw/handle/123456789/17026 |
DOI: | 10.14864/softscis.2006.0.698.0 |
Appears in Collections: | Department of Electrical Engineering |
Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated.