Two Regularizers for Recursive Least Squared Algorithms in Feedforward Multilayered Neural Networks
By:
Leung, Chi-Sing; Tsoi, Ah-Chung; Lai, Wan Chan
Type:
Article from Journal - international scholarly journal
In collection:
IEEE Transactions on Neural Networks vol. 12 no. 6 (2001), pages 1314-1332
Topics:
multilayer networks; regularizers; recursive least square algorithms; multilayered neural networks
Availability
Central Library (Semanggi)
Call number: II36.6
Non-reserve: 1 (available for loan: 0)
Reserve: none
Abstract
Recursive least squares (RLS)-based algorithms are a class of fast online training algorithms for feedforward multilayered neural networks (FMNNs). Although the standard RLS algorithm has an implicit weight decay term in its energy function, the weight decay effect decreases linearly as the number of learning epochs increases, so the decay's influence diminishes as training progresses. In this paper, we derive two modified RLS algorithms to tackle this problem. In the first, the true weight decay RLS (TWDRLS) algorithm, we consider a modified energy function in which the weight decay effect remains constant, irrespective of the number of learning epochs. The second, the input perturbation RLS (IPRLS) algorithm, is derived by requiring the prediction performance to be robust to input perturbations. Simulation results show that both algorithms improve the generalization capability of the trained network.