Article: Mutual Information Item Selection in Adaptive Classification Testing
Author: Weissman, Alexander
Type: Article from Journal - international scholarly
In collection: Educational and Psychological Measurement vol. 67 no. 1 (Feb. 2007), pages 41-58.
Topics: computerized adaptive testing; item selection; mutual information; sequential probability ratio test; classification
Fulltext: 41.pdf (491.6KB)
Article content: A general approach for item selection in adaptive multiple-category classification tests is provided. The approach uses mutual information (MI), a special case of the Kullback-Leibler distance, or relative entropy. MI works efficiently with the sequential probability ratio test and alleviates the difficulties encountered with other local- and global-information measures in the multiple-category classification setting. Results from simulation studies using three item selection methods, Fisher information (FI), posterior-weighted FI (FIP), and MI, are provided for an adaptive four-category classification test. Both across and within the four classification categories, it is shown that, in general, MI item selection classifies the highest proportion of examinees correctly and yields the shortest test lengths. The next best performance is observed for FIP item selection, followed by FI.
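
As an illustration of the mutual-information criterion summarized above, the sketch below computes MI between a candidate dichotomous item and a discretized posterior over the latent trait, then selects the item that maximizes it. This is only a minimal sketch under stated assumptions: the 2PL response function, the grid discretization, and all names (prob_correct, mutual_information, select_item, the item parameters a and b) are illustrative choices, not details taken from the article, which pairs MI selection with the sequential probability ratio test for multiple-category classification.

import numpy as np

def prob_correct(theta, a, b):
    # 2PL item response function (an assumed model; the article's
    # simulation design may use a different IRT model).
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def mutual_information(posterior, theta_grid, a, b):
    # MI between the 0/1 item response X and the latent trait theta,
    # with theta discretized on theta_grid and weighted by the current
    # posterior. As the abstract notes, MI is a special case of the
    # Kullback-Leibler distance:
    #   sum_x sum_theta p(x, theta) * log( p(x | theta) / p(x) ).
    p1 = prob_correct(theta_grid, a, b)           # P(X = 1 | theta)
    p_x_given_theta = np.vstack([1.0 - p1, p1])   # shape (2, n_grid)
    p_x = p_x_given_theta @ posterior             # marginal P(X = x)
    joint = p_x_given_theta * posterior           # P(x, theta)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log(p_x_given_theta / p_x[:, None])
    return float(np.nansum(terms))

def select_item(posterior, theta_grid, item_bank, administered):
    # Greedy MI item selection: among items not yet administered,
    # choose the one whose response is expected to be most informative.
    best_j, best_mi = None, -np.inf
    for j, (a, b) in enumerate(item_bank):
        if j in administered:
            continue
        mi = mutual_information(posterior, theta_grid, a, b)
        if mi > best_mi:
            best_j, best_mi = j, mi
    return best_j

# Example: uniform posterior on a coarse grid, three hypothetical items.
theta_grid = np.linspace(-3.0, 3.0, 61)
posterior = np.full(theta_grid.size, 1.0 / theta_grid.size)
item_bank = [(1.0, -1.0), (1.2, 0.0), (0.8, 1.5)]   # (a, b) pairs
print(select_item(posterior, theta_grid, item_bank, administered=set()))

In an actual adaptive classification test the posterior would be updated after each response and item selection would be interleaved with the sequential probability ratio test's classification decisions; that bookkeeping is omitted here.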