Detail
Article: Posterior Probability Support Vector Machines for Unbalanced Data
By: Tao, Qing; Wu, Gao-Wei; Wang, Fei-Yue; Wang, Jue
Type: Article from Journal - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 16 no. 6 (Nov. 2005), pages 1561-1573.
Topics: machines; posterior; probability; support vector machines; data
Availability
  • Perpustakaan Pusat (Semanggi)
    • Call number: II36
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Article content: This paper proposes a complete framework of posterior probability support vector machines (PPSVMs) for weighted training samples using modified concepts of risks, linear separability, margin, and optimal hyperplane. Within this framework, a new optimization problem for unbalanced classification problems is formulated and a new concept of support vectors is established. Furthermore, a soft PPSVM with an interpretable parameter ν is obtained, which is similar to the ν-SVM developed by Schölkopf et al., and an empirical method for determining the posterior probability is proposed as a new approach to determine ν. The main advantage of a PPSVM classifier lies in the fact that it is closer to the Bayes optimal rule without knowing the distributions. To validate the proposed method, two synthetic classification examples are used to illustrate the logical correctness of PPSVMs and their relationship to regular SVMs and Bayesian methods. Several other classification experiments are conducted to demonstrate that the performance of PPSVMs is better than that of regular SVMs in some cases. Compared with fuzzy support vector machines (FSVMs), the proposed PPSVM is a natural and analytical extension of regular SVMs based on statistical learning theory.
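
The abstract describes weighting each training sample by an empirically estimated posterior probability and then solving a modified soft-margin SVM problem. The paper's actual formulation changes the risk, margin, and ν-parameterization themselves; the sketch below only illustrates the general weighting idea under stated assumptions, using scikit-learn's SVC with sample_weight and a k-NN vote as a stand-in for the paper's empirical posterior estimate. The synthetic data, the k-NN estimator, and all parameter values are illustrative choices, not taken from the paper.

    # Minimal sketch (not the authors' exact PPSVM formulation):
    # weight training samples by an empirically estimated posterior
    # probability, then train a weighted soft-margin SVM.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Synthetic unbalanced two-class data: many negatives, few positives.
    X_neg = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(500, 2))
    X_pos = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(50, 2))
    X = np.vstack([X_neg, X_pos])
    y = np.hstack([np.zeros(500), np.ones(50)])

    # Empirical posterior estimate P(y_i | x_i) from a k-NN vote around each
    # training point (one plausible stand-in for an empirical method).
    knn = KNeighborsClassifier(n_neighbors=15).fit(X, y)
    proba = knn.predict_proba(X)                       # shape (n_samples, 2)
    posterior = proba[np.arange(len(y)), y.astype(int)]

    # Weighted SVM: each sample's influence is scaled by its estimated
    # posterior, so ambiguous points near the class overlap count less.
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, y, sample_weight=posterior)

    print("support vectors per class:", clf.n_support_)

In this sketch, down-weighting points whose estimated posterior is low pushes the learned boundary toward the Bayes-style decision the abstract alludes to; the paper instead builds this weighting into the optimization problem and its ν parameter directly.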
