Article: Neural Networks
By: Lieberman, David A.
Type: Article from Books - E-Book
In collection: Human Learning and Memory, pages 477-503.
Topics: Neural Networks; Brains and Computers; Rescorla–Wagner Model; Concept of Dog; Explaining Life, the Universe, and Everything
Fulltext: Neural Networks.pdf (526.52KB)
Article content: In previous chapters we encountered examples of learning and memory that varied widely in complexity, from rats learning to press a bar at one end, to humans trying to remember lessons in physics at the other. Ideally, we would like a theory that could encompass all these forms of learning, from rats to humans, from classical conditioning to language learning. In short, a theory of everything. This might at first seem an outrageous requirement – or, at any rate, one exceedingly unlikely to be fulfilled – but a theory has recently emerged that supporters claim has the potential to meet it. The new theory sets out to explain virtually every aspect of learning, from classical conditioning in animals to language learning in humans. And it does all this using just a single, almost unbelievably simple principle: that when two neurons are active at the same time, the connection between them will be strengthened. A variety of terms have been suggested to describe this new approach: connectionist, parallel distributed processing, and neural network. We will use the term neural network because it conveys a clearer sense of the assumption at the heart of the model, and in this chapter we will be looking at what this approach is, and how close it has come to achieving its extraordinary goal.
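The single principle the passage describes, that a connection strengthens when the two neurons it joins are active together, is known as Hebbian learning. A minimal sketch of that update rule follows; the learning rate and the activation values are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of the Hebbian principle: when the pre- and
# post-synaptic units are active at the same time, the connection
# between them is strengthened in proportion to their joint activity.
# The learning rate of 0.1 is an assumed, illustrative value.

def hebbian_update(weight, pre, post, rate=0.1):
    """Return the weight after one step of Hebbian strengthening."""
    return weight + rate * pre * post

# Two units repeatedly active at the same time: the weight grows.
w = 0.0
for _ in range(5):
    w = hebbian_update(w, pre=1.0, post=1.0)
print(w)  # connection strength after five co-activations
```

Note that if either unit is inactive (activation 0), the product is zero and the weight is unchanged, which captures the "active at the same time" condition in the text.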