A Local Linearized Least Squares Algorithm for Training Feedforward Neural Networks
Article from journal - international scholarly publication
IEEE Transactions on Neural Networks vol. 11 no. 2 (2000)
least squares algorithm
In training the weights of a feedforward neural network, the global extended Kalman filter (GEKF) algorithm is well known to outperform the popular gradient descent with error backpropagation in both convergence speed and quality of solution. However, the GEKF is very computationally intensive, which has led to the development of efficient algorithms such as the multiple extended Kalman algorithm (MEKA) and the decoupled extended Kalman filter algorithm (DEKF), which are based on dimensional reduction and/or partitioning of the global problem. In this paper we present a new training algorithm, called local linearized least squares (LLLS), based on viewing the local system identification subproblems at the neuron level as recursive linearized least squares problems. The objective function of the least squares problem for each neuron is the sum of the squares of the linearized backpropagated error signals. The new algorithm is shown to give better convergence results than MEKA for three benchmark problems, and better results than DEKF for highly coupled applications. The performance of the LLLS algorithm approaches that of the GEKF algorithm in the experiments.
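The abstract describes treating each neuron's weight update as a recursive linearized least squares problem driven by backpropagated error signals. The sketch below shows only the generic recursive least squares (RLS) recursion that such a per-neuron scheme could build on; the class name `NeuronRLS`, the forgetting factor `lam`, and the initial covariance scale `p0` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class NeuronRLS:
    """Recursive least squares update for one neuron's weight vector.

    Hedged sketch: LLLS as described solves a linearized least squares
    problem per neuron; this class implements only the standard RLS
    recursion, with the target d standing in for the linearized
    backpropagated error signal.
    """
    def __init__(self, n_inputs, lam=0.99, p0=100.0):
        self.w = np.zeros(n_inputs)          # neuron weight vector
        self.P = np.eye(n_inputs) * p0      # inverse input-correlation matrix
        self.lam = lam                      # exponential forgetting factor

    def update(self, x, d):
        """One recursive update from input x and linearized target d."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)        # Kalman-style gain vector
        e = d - self.w @ x                  # a priori prediction error
        self.w = self.w + k * e
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return e
```

On noise-free linear data the recursion drives the per-neuron weights to the least squares solution after a handful of samples, which is the behavior the decoupled, neuron-level view relies on.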