Detail
Article: A Local Linearized Least Squares Algorithm for Training Feedforward Neural Networks
By: Kamen, E.; Stan, O.
Type: Article from Journal - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 11 no. 2 (2000), pp. 487-495.
Topics: algorithms; local linearization; least squares algorithm; neural networks
Availability
  • Central Library (Semanggi)
    • Call Number: II36.4
    • Non-reserve: 1 (available for loan: 0)
    • Reserve: none
Article content: In training the weights of a feedforward neural network, it is well known that the global extended Kalman filter (GEKF) algorithm performs much better than the popular gradient descent with error backpropagation in terms of convergence and quality of solution. However, the GEKF is very computationally intensive, which has led to the development of efficient algorithms such as the multiple extended Kalman algorithm (MEKA) and the decoupled extended Kalman filter algorithm (DEKF), which are based on dimensional reduction and/or partitioning of the global problem. In this paper we present a new training algorithm, called local linearized least squares (LLLS), based on viewing the local system identification subproblems at the neuron level as recursive linearized least squares problems. The objective function of the least squares problem for each neuron is the sum of the squares of the linearized backpropagated error signals. The new algorithm is shown to give better convergence results than MEKA for three benchmark problems, and better than DEKF for highly coupled applications. The performance of the LLLS algorithm approaches that of the GEKF algorithm in the experiments.
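
The per-neuron formulation described in the abstract can be illustrated with a recursive least squares (RLS) update maintained separately for each neuron. The Python sketch below is a generic, assumption-laden illustration of that idea, not the authors' published LLLS algorithm: the tanh activation, the forgetting factor lam, and all function and variable names are hypothetical.

    import numpy as np

    # One recursive least squares (RLS) step for a single neuron, sketching
    # the "local linearized least squares" idea from the abstract. Not the
    # paper's exact update; activation, names, and forgetting factor are
    # assumptions.
    def rls_neuron_update(w, P, x, delta, lam=0.99):
        # w     : neuron weight vector
        # P     : per-neuron inverse correlation matrix
        # x     : input vector to the neuron
        # delta : linearized backpropagated error signal for this neuron
        # lam   : exponential forgetting factor (assumed)
        v = w @ x                        # pre-activation
        g = (1.0 - np.tanh(v) ** 2) * x  # linearized regressor: phi'(v) * x
        denom = lam + g @ P @ g
        k = (P @ g) / denom              # gain vector, as in standard RLS
        w = w + k * delta                # local weight update
        P = (P - np.outer(k, g @ P)) / lam
        return w, P

Each neuron would keep its own (w, P) pair, with P typically initialized to a scaled identity matrix; decoupling the problem at the neuron level is what keeps the cost far below GEKF's global covariance update.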