Detail
Article: A Modified Hopfield Auto-Associative Memory with Improved Capacity
By: Gimenez-Martinez, V.
Type: Journal article - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 11 no. 4 (2000), pages 867-878.
Topics: Hopfield method; Hopfield; auto-associative memory
Availability
  • Central Library (Semanggi)
    • Call Number: II36.4
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: This paper describes a new procedure to implement a recurrent neural network (RNN), based on a new approach to the well-known Hopfield auto-associative memory. In this approach an RNN is seen as a complete graph G, and the learning mechanism is also based on Hebb's law, but with a very significant difference: the weights, which control the dynamics of the net, are obtained by coloring the graph G. Once training is complete, the synaptic matrix of the net is the weight matrix of the graph. Any of these matrices fulfils certain spatial properties, and for this reason they are referred to as tetrahedral matrices. The geometrical properties of these tetrahedral matrices may be used to classify the n-dimensional state-vector space into n classes. In the recall stage, a parameter vector is introduced which is related to the capacity of the network. It may be shown that the bigger the value of the ith component of the parameter vector, the lower the capacity of the ith class of the state-vector space. Once the capacity has been controlled, a new set of parameters is introduced that uses the statistical deviation of the prototypes to compare them with the vectors that appear as fixed points, thus eliminating a great number of parasitic fixed points.
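The abstract does not give enough detail to reproduce the paper's graph-coloring weight construction, tetrahedral matrices, or recall-stage parameter vector, so the sketch below shows only the classical Hopfield auto-associative memory with Hebbian learning that the paper modifies. It is a minimal illustration; the function names and the one-bit-noise usage example are illustrative, not from the paper.

import numpy as np

def train_hebbian(prototypes):
    """Build the synaptic matrix W from bipolar (+/-1) prototype vectors
    using Hebb's rule, with zero self-connections (standard baseline,
    not the paper's graph-coloring construction)."""
    prototypes = np.asarray(prototypes, dtype=float)
    n = prototypes.shape[1]
    W = prototypes.T @ prototypes / n
    np.fill_diagonal(W, 0.0)          # no self-feedback
    return W

def recall(W, state, max_iters=100):
    """Asynchronously update units with the sign rule until a fixed point
    (or the iteration limit) is reached."""
    state = np.asarray(state, dtype=float).copy()
    n = state.size
    for _ in range(max_iters):
        changed = False
        for i in np.random.permutation(n):
            new_value = 1.0 if W[i] @ state >= 0 else -1.0
            if new_value != state[i]:
                state[i] = new_value
                changed = True
        if not changed:               # reached a fixed point of the dynamics
            break
    return state

# Usage: store two 8-dimensional bipolar patterns and recall from a noisy cue.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])
W = train_hebbian(patterns)
noisy = patterns[0].copy()
noisy[0] *= -1                        # flip one bit of the first prototype
print(recall(W, noisy))

With plain Hebbian weights this kind of network also converges to parasitic (spurious) fixed points when loaded near capacity, which is the limitation the paper's parameter vector and statistical-deviation test are introduced to address.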