Detail
Article: On Overfitting, Generalization, and Randomly Expanded Training Sets
By: Karystinos, G. N.; Pados, D. A.
Type: Article from journal - international scholarly publication
In collection: IEEE Transactions on Neural Networks vol. 11 no. 5 (2000), pages 1050-1057.
Topics: training; overfitting; generalization; training sets
Availability
  • Central Library (Semanggi)
    • Call number: II36.4
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: An algorithmic procedure is developed for the random expansion of a given training set to combat overfitting and improve the generalization ability of backpropagation-trained multilayer perceptrons (MLPs). The training set is K-means clustered, and locally most entropic colored Gaussian joint input-output probability density function estimates are formed per cluster. The number of clusters is chosen such that the resulting overall colored Gaussian mixture exhibits minimum differential entropy upon global cross-validated shaping. Numerical studies on real and synthetic data examples drawn from the literature illustrate and support these theoretical developments.
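
The abstract describes the procedure only at a high level. As a rough illustration of the underlying clustering-and-sampling idea, the following is a minimal sketch in Python, assuming NumPy and scikit-learn are available. The function name expand_training_set, the fixed cluster count, the covariance ridge, and the sample count are hypothetical choices for illustration; the paper's entropy-minimizing selection of the number of clusters and its global cross-validated shaping of the mixture are not reproduced here.

import numpy as np
from sklearn.cluster import KMeans

def expand_training_set(X, Y, n_clusters=3, n_new=200, reg=1e-6, seed=0):
    """Illustrative sketch (not the authors' exact algorithm):
    cluster the joint input-output samples with K-means, fit a
    full-covariance ("colored") Gaussian per cluster, and draw new
    synthetic (x, y) pairs from the resulting Gaussian mixture.
    X is (n, d_x) and Y is (n, d_y); both are 2-D arrays."""
    rng = np.random.default_rng(seed)
    Z = np.hstack([X, Y])                       # joint input-output vectors
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(Z)

    means, covs, weights = [], [], []
    for k in range(n_clusters):
        Zk = Z[km.labels_ == k]
        means.append(Zk.mean(axis=0))
        # full covariance per cluster; a small ridge keeps it positive definite
        covs.append(np.cov(Zk, rowvar=False) + reg * np.eye(Z.shape[1]))
        weights.append(len(Zk) / len(Z))
    weights = np.asarray(weights)
    weights /= weights.sum()                    # guard against rounding error

    # allocate the synthetic samples across clusters, then sample each Gaussian
    counts = rng.multinomial(n_new, weights)
    new = np.vstack([
        rng.multivariate_normal(means[k], covs[k], size=c)
        for k, c in enumerate(counts) if c > 0
    ])
    X_new, Y_new = new[:, :X.shape[1]], new[:, X.shape[1]:]
    return np.vstack([X, X_new]), np.vstack([Y, Y_new])

Following the abstract, the expanded training set would then be used to train the MLP with standard backpropagation, with the synthetic samples serving to reduce overfitting on the original data.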