Any Reasonable Cost Function Can be Used for a Posteriori Probability Approximation
By: Saerens, M.; Latinne, P.; Decaestecker, C.
Type: Journal article - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 13 no. 5 (2002), pages 1204-1210.
Topics: cost; cost function; probability; approximation
Availability
Perpustakaan Pusat (Semanggi)
Call Number: II36.7A
Non-reserve copies: 1 (available for loan: 0)
Reserve copies: none
Abstract
In this letter, we provide a straightforward proof of an important, but nevertheless little known, result obtained by Lindley in the framework of subjective probability theory. This result, once interpreted in the machine learning/pattern recognition context, sheds new light on the probabilistic interpretation of the output of a trained classifier. A learning machine, or more generally a model, is usually trained by minimizing a criterion (the expectation of the cost function) measuring the discrepancy between the model output and the desired output. We first show that, for the binary classification case, training the model with any "reasonable cost function" can lead to Bayesian a posteriori probability estimation. Indeed, after the model has been trained by minimizing the criterion, there always exists a computable transformation that maps its output to the Bayesian a posteriori probability of class membership given the input. Necessary conditions allowing the computation of the transformation mapping the outputs of the model to the a posteriori probabilities are then derived for the multioutput case. Finally, these theoretical results are illustrated through simulation examples involving various cost functions.
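To make the abstract's central claim concrete, here is a minimal NumPy sketch (an illustration of the general idea, not code from the paper). On synthetic data with a known posterior, the per-input minimizer of the squared-error cost is already the posterior probability, while the minimizer of the exponential cost exp(-t f(x)) with targets t in {-1, +1} must be passed through the standard inverse map p = 1/(1 + exp(-2f)) to recover it. The Gaussian data model, the binning scheme, and the choice of these two cost functions are assumptions for this demo; the paper's result covers arbitrary "reasonable" cost functions.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000
y = rng.integers(0, 2, size=n)                 # labels in {0, 1}, equal priors
x = rng.normal(loc=2.0 * y - 1.0, scale=1.0)   # class-conditionals N(-1, 1) and N(+1, 1)
# For this model the exact posterior is logistic: P(y=1 | x) = 1 / (1 + exp(-2x)).

edges = np.linspace(-3.0, 3.0, 25)
bin_of = np.digitize(x, edges)

for b in range(1, len(edges)):
    m = bin_of == b
    n_pos = int(y[m].sum())
    n_neg = int(m.sum()) - n_pos
    if n_pos == 0 or n_neg == 0:
        continue
    center = 0.5 * (edges[b - 1] + edges[b])
    p_true = 1.0 / (1.0 + np.exp(-2.0 * center))
    # Squared error: the per-bin minimizer of sum((f - y)^2) is the mean of y,
    # i.e. the empirical posterior -- the output-to-posterior map is the identity.
    p_sq = n_pos / (n_pos + n_neg)
    # Exponential cost sum(exp(-t * f)), t in {-1, +1}: setting the derivative
    # to zero gives f* = 0.5 * log(n_pos / n_neg), so the posterior is recovered
    # through the transformation p = 1 / (1 + exp(-2 f*)).
    f_exp = 0.5 * np.log(n_pos / n_neg)
    p_exp = 1.0 / (1.0 + np.exp(-2.0 * f_exp))
    if b % 4 == 0:
        print(f"x={center:+.2f}  true={p_true:.3f}  sq-err={p_sq:.3f}  exp-cost={p_exp:.3f}")

Up to binning and sampling noise, both estimates agree with the true posterior, which is the point of the result: the choice of cost function changes the raw model output, but a known, computable transformation brings it back to P(y=1 | x).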