Model Selection Using Information Theory and the MDL Principle
By: Stine, Robert A.
Type: Article from Journal - international scholarly
In collection: Sociological Methods & Research (SMR) vol. 33 no. 02 (Nov. 2004), pages 230-260.
Topics: Akaike Information Criterion (AIC); Bayes Information Criterion (BIC); Risk Inflation Criterion (RIC); Cross-Validation; Model Selection; Stepwise Regression; Regression Tree
Availability
PKPM Library
Call number: S28
Non-reserve copies: 1 (available for loan: 0)
Reserve copies: none
Abstract
Information theory offers a coherent, intuitive view of model selection. This perspective arises from thinking of a statistical model as a code, an algorithm for compressing data into a sequence of bits. The description length is the length of this code for the data plus the length of a description of the model itself. The length of the code for the data measures the fit of the model to the data, whereas the length of the code for the model measures its complexity. The minimum description length (MDL) principle picks the model with smallest description length, balancing fit versus complexity. Variations on MDL reproduce other well-known methods of model selection. Going further, information theory allows one to choose from among various types of models, permitting the comparison of tree-based models to regressions. A running example compares several models for the well-known Boston housing data.
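To make the two-part code concrete, the sketch below (not taken from the article) computes an approximate description length for Gaussian least-squares regression: the data cost is the Gaussian code length of the residuals in bits, and the model cost uses the common (k/2)·log2(n) approximation for k estimated coefficients, the same penalty that underlies BIC. The synthetic data, variable names, and the description_length helper are illustrative assumptions only.

# Minimal sketch of two-part MDL model selection for Gaussian linear
# regression (illustrative only; not the article's implementation).
import numpy as np

def description_length(y, X):
    """Approximate two-part description length (in bits) of a least-squares fit."""
    n, k = X.shape
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n                       # ML estimate of noise variance
    # L(data | model): Gaussian code length for the residuals, in bits
    data_bits = 0.5 * n * np.log2(2 * np.pi * np.e * sigma2)
    # L(model): roughly (k/2) log2(n) bits to state k coefficients to
    # estimation precision (a standard coarse approximation)
    model_bits = 0.5 * k * np.log2(n)
    return data_bits + model_bits

rng = np.random.default_rng(0)
n = 200
x1, x2, x3 = rng.normal(size=(3, n))
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n)        # only x1 matters

small = np.column_stack([np.ones(n), x1])                 # true structure
large = np.column_stack([np.ones(n), x1, x2, x3])         # adds useless predictors

print("small model:", round(description_length(y, small), 1), "bits")
print("large model:", round(description_length(y, large), 1), "bits")

Under this accounting, predictors that do not compress the data appreciably only add bits to the model's own description, so the smaller total description length points to the simpler model, mirroring the fit-versus-complexity balance described in the abstract.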