Detail
Article: Empirical Risk Minimization for Support Vector Classifiers
By: Perez-Cruz, F.; Navia-Vazquez, A.; Figueiras-Vidal, A. R.; Artes-Rodriguez, A.
Type: Journal article - international scholarly journal
In collection: IEEE Transactions on Neural Networks vol. 14 no. 2 (2003), pp. 296-303.
Topics: risk theory; risk minimization; support vector classifiers
Availability
  • Perpustakaan Pusat (Semanggi)
    • Call number: II36.7
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: In this paper, we propose a general technique for solving support vector classifiers (SVCs) with an arbitrary loss function, relying on the application of an iterative reweighted least squares (IRWLS) procedure. We further show that three properties of the SVC solution can be written as conditions on the loss function. This technique allows the empirical risk minimization (ERM) inductive principle to be implemented on large margin classifiers while, at the same time, obtaining very compact solutions (in terms of the number of support vectors). The improvements obtained by changing the SVC loss function are illustrated with synthetic and real-data examples.
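
For intuition, the sketch below shows how one IRWLS-style pass for a plain hinge-loss SVC with a linear kernel might look: the hinge loss is locally approximated by a weighted quadratic, a regularized weighted least-squares system is solved, and the weights are recomputed. This is only a minimal sketch under stated assumptions, not the procedure from the paper: the weight formula a_i = 2C/e_i, its capping, the kernel-free setup, the fixed iteration limit, and the name irwls_svc are illustrative choices.

import numpy as np

def irwls_svc(X, y, C=1.0, n_iter=100, tol=1e-6):
    # Minimal IRWLS-style sketch for a linear SVC with the hinge loss.
    # Hypothetical illustration; the capping of the weights and the fixed
    # iteration cap stand in for the safeguards a real solver would need.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iter):
        # Slack e_i = 1 - y_i f(x_i); positive values are hinge-loss violations.
        e = 1.0 - y * (X @ w + b)
        # IRWLS weights: approximate C*max(0, e_i) by (a_i/2) e_i^2 with
        # a_i = 2C/e_i on violated points and 0 elsewhere (capped for stability).
        a = np.where(e > 0.0, 2.0 * C / np.maximum(e, 1e-6), 0.0)
        # Weighted regularized least squares for [w, b]:
        #   minimize 0.5*||w||^2 + 0.5 * sum_i a_i (y_i - w.x_i - b)^2
        Xa = np.hstack([X, np.ones((n, 1))])            # bias column appended
        A = Xa.T * a                                    # Xa^T diag(a)
        H = A @ Xa + np.diag(np.r_[np.ones(d), 0.0])    # regularize w, not b
        sol, *_ = np.linalg.lstsq(H, A @ y, rcond=None)
        w_new, b_new = sol[:d], sol[d]
        if np.linalg.norm(np.r_[w_new - w, b_new - b]) < tol:
            w, b = w_new, b_new
            break
        w, b = w_new, b_new
    return w, b

# Toy usage: two Gaussian blobs, labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.r_[-np.ones(50), np.ones(50)]
w, b = irwls_svc(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))

Points whose weight a_i stays positive at convergence play the role of support vectors, which is where the compactness claim of the abstract enters: a loss that drives more weights to zero yields fewer support vectors.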