Article: Unsupervised Hebbian Learning for PCA
By: Lukas.
Type: Article from journal (national scientific journal, not DIKTI-accredited) - Atma Jaya
In collection: Metris: Jurnal Mesin, Elektro, Industri dan Sains vol. 6 no. 4 (Dec. 2005), pages 303-307.
Topics: Hebbian Learning; Neural Networks; Principal Component Analysis.
Full text: Lukas-Bernard.pdf (5.53 MB)
Abstract: Sanger introduced a new approach to unsupervised learning in a single-layer linear feedforward network. He proposed an optimality principle based upon preserving maximal information in the output units. It is assumed that the structure of the network is such that the output layer has fewer outputs than inputs. Under these circumstances, the layer will have to preserve as much information as possible. Maximization of output information was first suggested by Linsker (1988) as a principle for designing neural networks. In this paper, an optimally trained layer is defined as one which allows for a linear reconstruction with minimal mean squared error (MSE). With this optimality principle, the optimal solution can be found in closed form. The solution is given by the network whose weight vectors span the space defined by the first few eigenvectors of the autocorrelation matrix of the input. If the weights are the eigenvectors themselves, then the outputs will be uncorrelated and their variance will be maximized.
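The learning rule the abstract refers to is commonly known as Sanger's rule, or the Generalized Hebbian Algorithm (GHA): each weight row receives a Hebbian update plus a Gram-Schmidt-like correction that forces the rows toward successive eigenvectors of the input autocorrelation matrix. A minimal sketch follows; the function name, learning rate, epoch count, and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sanger_pca(X, n_components, lr=0.005, epochs=300, seed=0):
    """Estimate leading principal components with Sanger's rule (GHA).

    X: (n_samples, n_features) zero-mean data.
    Returns W of shape (n_components, n_features); the rows converge
    toward the leading eigenvectors of the input autocorrelation matrix.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(n_components, X.shape[1]))
    for _ in range(epochs):
        for x in X:
            y = W @ x  # outputs of the linear layer
            # Hebbian term y x^T minus the lower-triangular correction,
            # which decorrelates each output from the ones above it.
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Demo on correlated 2-D data: the first weight vector should align
# with the direction of largest variance.
rng = np.random.default_rng(1)
z = rng.normal(size=(500, 2)) * np.array([3.0, 0.5])
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = z @ R.T
X -= X.mean(axis=0)

W = sanger_pca(X, n_components=2)
# Compare with the leading eigenvector of the sample autocorrelation matrix.
eigvals, eigvecs = np.linalg.eigh(X.T @ X / len(X))
top = eigvecs[:, -1]
cosine = abs(W[0] @ top) / np.linalg.norm(W[0])
print(round(cosine, 3))
```

Note that, consistent with the abstract, the rows of `W` end up spanning the principal subspace with (near-)unit norm, so the network's outputs are approximately uncorrelated projections onto the leading eigenvectors.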