LSTM Recurrent Networks Learn Simple Context-Free and Context-Sensitive Languages
By:
Gers, F. A.; Schmidhuber, J.
Type:
Journal article - international scholarly
In collection:
IEEE Transactions on Neural Networks vol. 12 no. 6 (2001), pages 1333-1340.
Topics:
networks; LSTM; simple context-free; context-sensitive languages
Availability
Perpustakaan Pusat (Semanggi)
Call number: II36.6
Non-reserve copies: 1 (available for loan: 0)
Reserve copies: none
Article content
Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTM's superior performance on context-free language benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a simple context-sensitive language, namely a^n b^n c^n.
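The language a^n b^n c^n mentioned above is context-sensitive because the three equal-length blocks cannot be verified by a context-free grammar. As a minimal sketch (the helper names here are illustrative, not taken from the paper), the following shows how such strings can be generated as training exemplars and how membership can be checked:

```python
def anbncn(n):
    """Return the string a^n b^n c^n, e.g. anbncn(2) -> 'aabbcc'."""
    return "a" * n + "b" * n + "c" * n

def is_anbncn(s):
    """Check whether s has the form a^n b^n c^n for some n >= 1."""
    n = len(s) // 3
    return n >= 1 and len(s) == 3 * n and s == anbncn(n)

# Exemplary training sequences: strings a^n b^n c^n for a range of n.
# In next-symbol-prediction experiments of this kind, the network reads
# each string one symbol at a time and must predict the symbol that follows.
examples = [anbncn(n) for n in range(1, 11)]
```

A network that has truly learned the language must count the a's and reproduce that count twice, which is exactly what makes this a harder benchmark than the context-free languages discussed earlier in the abstract.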