Detail
Article: Structure and Dynamics of Random Recurrent Neural Networks
By: Berry, Hugues; Quoy, Mathias
Type: Article from Journal - e-Journal
In collection: Adaptive Behavior vol. 14 no. 2 (Jun. 2006), pages 129–137.
Topics: associative memory; complex networks; Hebbian learning; chaotic neural networks
Full text: 129.pdf (1.19 MB)
Abstract: Contrary to Hopfield-like networks, random recurrent neural networks (RRNN), where the couplings are random, exhibit complex dynamics (limit cycles, chaos). It is possible to store information in these networks through Hebbian learning. Eventually, learning "destroys" the dynamics and leads to a fixed-point attractor. We investigate here the structural changes occurring in the network through learning. We show that a simple Hebbian learning rule organizes synaptic weight redistribution on the network from an initial homogeneous and random distribution to a heterogeneous one, in which strong synaptic weights preferentially assemble into triangles. Hence learning organizes the network of large synaptic weights as a "small-world" network.
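The abstract does not give the model equations or the exact Hebbian rule, so the following Python sketch is illustrative only: the network size N, gain g, learning rate eta, the tanh transfer function, and the 95th-percentile weight threshold are assumptions, not values from the paper. It shows the general scheme the abstract describes: a randomly coupled recurrent network, a simple Hebbian update applied during the dynamics, convergence toward a fixed point, and a triangle count on the graph of the strongest weights as a crude probe of the clustering the authors report.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 100      # number of neurons (assumed)
    g = 3.0      # coupling gain, large enough for complex dynamics (assumed)
    eta = 0.01   # Hebbian learning rate (assumed)

    # Initial couplings: random and homogeneous, as in the abstract.
    W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    np.fill_diagonal(W, 0.0)

    x = rng.uniform(-1.0, 1.0, size=N)   # network state

    def step(x, W):
        # One discrete-time update with a tanh transfer function (assumed).
        return np.tanh(W @ x)

    # Alternate network dynamics with Hebbian weight updates.
    for t in range(2000):
        x_new = step(x, W)
        W += eta * np.outer(x_new, x)    # strengthen couplings between co-active neurons
        np.fill_diagonal(W, 0.0)
        x = x_new

    # After learning, the dynamics typically settle onto a fixed point,
    # so successive states barely change.
    print("state change per step:", np.linalg.norm(step(x, W) - x))

    # The structural claim concerns the graph of *large* weights: threshold |W|,
    # symmetrize, and count triangles among the strongest couplings.
    A = (np.abs(W) > np.percentile(np.abs(W), 95)).astype(int)
    A = np.maximum(A, A.T)               # undirected graph of strong couplings
    np.fill_diagonal(A, 0)
    triangles = int(np.trace(np.linalg.matrix_power(A, 3)) // 6)
    print("triangles among the strongest 5% of weights:", triangles)

A fuller reproduction of the paper's result would also compare the triangle count and path lengths of the thresholded graph against a degree-matched random graph, which is how "small-world" structure is usually established.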