Parallel Growing and Training of Neural Networks Using Output Parallelism
By: Guan, Sheng-Uei; Li, Shanchun
Type: Article from Journal - international scientific
In collection: IEEE Transactions on Neural Networks vol. 13 no. 3 (2002), pages 542-550.
Topics: Parallel Forms; parallel growing; training; neural networks; output parallelism
Availability
Perpustakaan Pusat (Semanggi)
Call Number: II36.6
Non-reserve: 1 (available for loan: 0)
Reserve: none
Article content
In order to find an appropriate architecture for a large-scale real-world application automatically and efficiently, a natural method is to divide the original problem into a set of subproblems. In this paper, we propose a simple neural-network task decomposition method based on output parallelism. Using this method, a problem can be divided flexibly into several subproblems as chosen, each of which is composed of the whole input vector and a fraction of the output vector. Each module (for one subproblem) is responsible for producing a fraction of the output vector of the original problem. In this way, the hidden structure for the original problem's output units is decoupled. These modules can be grown and trained in parallel on parallel processing elements. Incorporated with a constructive learning algorithm, our method does not require excessive computation or any prior knowledge concerning decomposition. The feasibility of output parallelism is analyzed and proved. Some benchmarks are implemented to test the validity of this method. Their results show that this method can reduce computational time, increase learning speed, and improve generalization accuracy for both classification and regression problems.
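The decomposition described above can be illustrated with a minimal sketch. The module architecture, hidden-layer size, training loop, and the toy three-output problem below are illustrative assumptions, not the authors' constructive growing algorithm: the point is only that each module receives the whole input vector but learns a chosen fraction of the output vector, so the modules share no weights and can be trained independently (and hence in parallel).

```python
# Minimal sketch of output parallelism (assumed single-hidden-layer modules;
# the paper's constructive growing of hidden units is omitted for brevity).
import numpy as np

def train_module(X, y_slice, hidden=8, epochs=500, lr=0.1, seed=0):
    """Train one module on the full input X and its fraction y_slice of the outputs."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], y_slice.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, hidden))
    W2 = rng.normal(0, 0.5, (hidden, n_out))
    for _ in range(epochs):
        H = np.tanh(X @ W1)                 # hidden activations
        Y = 1 / (1 + np.exp(-(H @ W2)))     # sigmoid outputs
        err = Y - y_slice                   # squared-error gradient per sample
        dW2 = H.T @ (err * Y * (1 - Y)) / len(X)
        dH = (err * Y * (1 - Y)) @ W2.T
        dW1 = X.T @ (dH * (1 - H**2)) / len(X)
        W2 -= lr * dW2
        W1 -= lr * dW1
    return W1, W2

def predict_module(X, W1, W2):
    H = np.tanh(X @ W1)
    return 1 / (1 + np.exp(-(H @ W2)))

# Toy problem: a 3-output task decomposed into 3 modules, each responsible
# for one fraction (here a single unit) of the output vector.
X = np.random.default_rng(1).uniform(-1, 1, (200, 4))
Y = np.stack([X[:, 0] > 0,
              X[:, 1] * X[:, 2] > 0,
              X.sum(axis=1) > 0], axis=1).astype(float)

groups = [[0], [1], [2]]   # chosen partition of the output units
modules = [train_module(X, Y[:, g], seed=i) for i, g in enumerate(groups)]
# In a true parallel setting each call above would run on its own processing
# element (e.g. via multiprocessing); since the modules are decoupled, no
# synchronization between them is needed during training.
Y_hat = np.hstack([predict_module(X, W1, W2) for (W1, W2) in modules])
print("training accuracy:", ((Y_hat > 0.5) == Y).mean())
```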