Detail
Article: A Recurrent Log-Linearized Gaussian Mixture Network
By: Tsuji, T.; Kaneko, M.; Fukuda, O.; Bu, Nan
Type: Journal article - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 14 no. 2 (2003), pages 304-316
Topics: Gaussian; log-linearized; Gaussian mixture network
Availability
  • Perpustakaan Pusat (Semanggi)
    • Call number: II36.7
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Article content: Context in time series is one of the most useful and interesting characteristics for machine learning. In some cases, the dynamic characteristic may be the only basis for achieving a possible classification. A novel neural network, named "a recurrent log-linearized Gaussian mixture network (R-LLGMN)," is proposed in this paper for classification of time series. The structure of this network is based on a hidden Markov model (HMM), which has been well developed in the area of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network using a log-linearized Gaussian mixture model, in which recurrent connections have been incorporated to make use of temporal information. Simulation experiments are carried out to compare R-LLGMN with the traditional HMM estimator as classifiers, and finally, pattern classification experiments on EEG signals are conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
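Since the abstract describes the network's structure in terms of an HMM with Gaussian mixture components and recurrent accumulation of temporal information, the following Python sketch illustrates that underlying mechanism only: a generic scaled HMM forward recursion with Gaussian-mixture emission densities, used to score and classify a time series. This is a minimal illustration, not the R-LLGMN network itself; all function names, parameter shapes, and the diagonal-covariance assumption are assumptions made for the sketch, not taken from the paper.

import numpy as np

def gaussian_mixture_density(x, weights, means, variances):
    """Diagonal-covariance Gaussian mixture density for one observation x (shape (D,))."""
    # weights: (M,), means: (M, D), variances: (M, D)
    diff = x[None, :] - means                                  # (M, D)
    log_comp = -0.5 * (np.log(2 * np.pi * variances) + diff**2 / variances)
    comp = np.exp(log_comp.sum(axis=1))                        # per-component densities
    return float(np.dot(weights, comp))

def forward_log_likelihood(X, pi, A, mixtures):
    """
    Log-likelihood of sequence X under an HMM with GMM emissions (scaled forward algorithm).
    X: (T, D) observations; pi: (K,) initial state probabilities;
    A: (K, K) transition matrix; mixtures: list of K (weights, means, variances) tuples.
    """
    T = X.shape[0]
    log_lik = 0.0
    # Forward variable at time 0: initial state probability times emission density.
    alpha = pi * np.array([gaussian_mixture_density(X[0], *m) for m in mixtures])
    for t in range(1, T + 1):
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha /= scale                                         # rescale to avoid underflow
        if t == T:
            break
        emis = np.array([gaussian_mixture_density(X[t], *m) for m in mixtures])
        alpha = (alpha @ A) * emis                             # recurrent accumulation over time
    return log_lik

def classify(X, class_models):
    """Assign X to the class whose HMM-GMM model yields the highest log-likelihood."""
    scores = [forward_log_likelihood(X, *model) for model in class_models]
    return int(np.argmax(scores))

In this sketch, one (pi, A, mixtures) model would be fitted per class, and a test sequence is labeled by the maximum-likelihood model; the R-LLGMN described in the abstract instead realizes this kind of computation inside a single recurrent network trained discriminatively.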