Article: Using Function Approximation to Analyze the Sensitivity of MLP with Antisymmetric Squashing Activation Function
By: Yeung, D. S.; Sun, Xuequan
Type: Article from Journal - international scientific
In collection: IEEE Transactions on Neural Networks vol. 13 no. 1 (2002), pages 34-44.
Topics: functional theory; function approximation; sensitivity; MLP; antisymmetric squashing function
Availability
  • Central Library (Semanggi)
    • Call Number: II36.6
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: Sensitivity analysis of a neural network is usually investigated after the network has been designed and trained; very few studies have treated it as a critical issue prior to network design. Piche's statistical method (1992, 1995) is useful for multilayer perceptron (MLP) design, but it imposes severe limitations on both input and weight perturbations. This paper attempts to generalize Piche's method by deriving a universal expression for the sensitivity of MLPs with antisymmetric squashing activation functions, without any restriction on input and output perturbations. Experimental results based on a three-layer MLP with 30 nodes per layer agree closely with our theoretical investigations. The effects of network design parameters such as the number of layers, the number of neurons per layer, and the chosen activation function are analyzed, and they provide useful information for network design decision making. Based on the sensitivity analysis of the MLP, we present a network design method that, for a given application, determines the network structure and estimates the permitted weight range for network training.
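The kind of sensitivity the abstract describes can be illustrated empirically. The sketch below is not the paper's analytical derivation; it is a Monte Carlo estimate, under assumed noise scales, of the mean-squared output deviation of a three-layer tanh MLP (tanh being a common antisymmetric squashing function) with 30 nodes per layer, matching the experimental setup mentioned in the abstract. The weight scale and perturbation magnitudes are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (assumptions noted): empirically estimate the output
# sensitivity of a three-layer MLP with an antisymmetric squashing activation
# (tanh) under small random input and weight perturbations.

rng = np.random.default_rng(0)
n = 30  # 30 nodes per layer, as in the paper's experiments

def mlp(x, weights):
    """Forward pass through a tanh MLP."""
    a = x
    for W in weights:
        a = np.tanh(W @ a)
    return a

# Random network and input (weight scale 0.2 is an assumption)
weights = [rng.normal(0.0, 0.2, (n, n)) for _ in range(3)]
x = rng.normal(0.0, 1.0, n)

# Monte Carlo estimate of mean-squared output deviation
sigma_x, sigma_w = 0.01, 0.01  # assumed perturbation scales
trials = 1000
y0 = mlp(x, weights)
devs = []
for _ in range(trials):
    xp = x + rng.normal(0.0, sigma_x, n)
    wp = [W + rng.normal(0.0, sigma_w, W.shape) for W in weights]
    devs.append(np.mean((mlp(xp, wp) - y0) ** 2))

sensitivity = float(np.mean(devs))
print(f"estimated sensitivity: {sensitivity:.6f}")
```

Repeating the estimate while varying the number of layers, nodes per layer, or the activation function gives an empirical analogue of the design-parameter study described in the abstract.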