Detail
Article: On the Sample Complexity of Learning for Networks of Spiking Neurons With Nonlinear Synaptic Interactions
By: Schmitt, M.
Type: Journal article - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 15 no. 5 (Sep. 2004), pages 995-1001.
Topics: sample complexity; learning; networks; spiking neurons; nonlinear synaptic interactions
Availability
  • Perpustakaan Pusat (Semanggi)
    • Call number: II36.11
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: We study networks of spiking neurons that use the timing of pulses to encode information. Nonlinear interactions model the spatial groupings of synapses on the neural dendrites and describe the computations performed at local branches. Within a theoretical framework of learning, we analyze the question of how many training examples these networks must receive to be able to generalize well. Bounds for this sample complexity of learning can be obtained in terms of a combinatorial parameter known as the pseudodimension. This dimension characterizes the computational richness of a neural network and is given in terms of the number of network parameters. Two types of feedforward architectures are considered: constant-depth networks and networks of unconstrained depth. We derive asymptotically tight bounds for each of these network types. Constant-depth networks are shown to have an almost linear pseudodimension, whereas the pseudodimension of general networks is quadratic. Networks of spiking neurons that use temporal coding are becoming increasingly important in practical tasks such as computer vision, speech recognition, and motor control. The question of how well these networks generalize from a given set of training examples is a central issue for their successful application as adaptive systems. The results show that, although coding and computation in these networks are quite different and in many cases more powerful, their generalization capabilities are at least as good as those of traditional neural network models.
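
As a reading aid, the abstract's asymptotic claims can be written in standard notation. This is a hedged sketch only: W denotes the number of network parameters, reading "almost linear" as O(W log W) is an assumption about the precise form, and the final sample-complexity expression is the standard PAC-style bound in terms of the pseudodimension, not a formula taken from the article.

% Hedged sketch of the stated asymptotics; W = number of network parameters.
% "Almost linear" and "quadratic" are the abstract's terms; the log factor is an assumption.
\[
  \mathrm{Pdim}\bigl(\mathcal{N}_{\text{const-depth}}\bigr) = O(W \log W),
  \qquad
  \mathrm{Pdim}\bigl(\mathcal{N}_{\text{general}}\bigr) = \Theta\bigl(W^{2}\bigr).
\]
% Standard sample-complexity bound in terms of the pseudodimension d = Pdim(N),
% accuracy epsilon and confidence delta (general PAC form, assumed here for illustration):
\[
  m(\varepsilon,\delta) = O\!\left(\frac{1}{\varepsilon}\left(d \log\frac{1}{\varepsilon} + \log\frac{1}{\delta}\right)\right).
\]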