Detail
Article: Stability Analysis of Discrete-Time Recurrent Neural Networks
By: Barabanov, N. E.; Prokhorov, D. V.
Type: Article from Journal - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 13 no. 2 (2002), pp. 292-303.
Topics: neural networks; stability analysis; discrete-time
Availability
  • Perpustakaan Pusat (Semanggi)
    • Call number: II36.6
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: We address the problem of global Lyapunov stability of discrete-time recurrent neural networks (RNNs) in the unforced (unperturbed) setting. It is assumed that the network weights are fixed to some values, for example, those attained after training. Based on classical results from the theory of absolute stability, we propose a new approach to the stability analysis of RNNs with sector-type monotone nonlinearities and nonzero biases. We devise a simple state-space transformation to convert the original RNN equations into a form suitable for our stability analysis. We then present appropriate linear matrix inequalities (LMIs) to be solved to determine whether the system under study is globally exponentially stable. Unlike previous treatments, our approach readily permits one to account for the nonzero biases usually present in RNNs for improved approximation capabilities. We show how recent results of others on the stability analysis of RNNs can be interpreted as special cases within our approach. We illustrate how to use our approach with examples. Although illustrated on the stability analysis of recurrent multilayer perceptrons, the proposed approach can also be applied to other forms of time-lagged RNNs.
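In the linear special case (no nonlinearity, zero bias), the LMI test the abstract describes reduces to the classical discrete-time Lyapunov condition: x_{k+1} = A x_k is globally exponentially stable iff there is a P > 0 with A^T P A - P < 0. The sketch below illustrates only this special case, not the article's full sector-bounded analysis; the matrix A and the use of SciPy's Lyapunov solver are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical state matrix of a linearized RNN (illustrative values only).
A = np.array([[0.5, 0.2],
              [-0.1, 0.3]])

# Solve the discrete Lyapunov equation A^T P A - P = -Q with Q = I.
# A positive definite solution P certifies global exponential stability
# of x_{k+1} = A x_k.
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)

# Check the certificate: P > 0 and spectral radius of A below 1.
p_positive = bool(np.all(np.linalg.eigvalsh(P) > 0))
rho = float(np.max(np.abs(np.linalg.eigvals(A))))
print(p_positive, rho < 1)
```

For the nonlinear RNNs treated in the article, the scalar Lyapunov equation above is replaced by LMIs that additionally encode the sector bounds on the monotone activation functions, and those LMIs are solved with a semidefinite programming solver.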