Self-Splitting Competitive Learning : A New On-Line Clustering Paradigm
By:
Zhang, Ya-Jun; Liu, Zhi-Qiang
Type:
Article from Journal - international scientific
In collection:
IEEE Transactions on Neural Networks vol. 13 no. 2 (2002), pages 369-380.
Topics:
clustering; self-splitting; learning; paradigm
Availability
Perpustakaan Pusat (Semanggi)
Call number:
II36.6
Non-reserve copies:
1 (available for loan: 0)
Reserve copies:
none
Abstract
Clustering in the neural-network literature is generally based on the competitive learning paradigm. The paper addresses two major issues associated with conventional competitive learning, namely, sensitivity to initialization and difficulty in determining the number of prototypes. In general, selecting the appropriate number of prototypes is a difficult task, as we do not usually know the number of clusters in the input data a priori. It is therefore desirable to develop an algorithm that has no dependency on the initial prototype locations and is able to adaptively generate prototypes to fit the input data patterns. We present a new, more powerful competitive learning algorithm, self-splitting competitive learning (SSCL), that is able to find the natural number of clusters based on the one-prototype-take-one-cluster (OPTOC) paradigm and a self-splitting validity measure. It starts with a single prototype randomly initialized in the feature space and splits adaptively during the learning process until all clusters are found; each cluster is associated with a prototype at its center. We have conducted extensive experiments to demonstrate the effectiveness of the SSCL algorithm. The results show that SSCL has the desired ability for a variety of applications, including unsupervised classification, curve detection, and image segmentation.
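The OPTOC behavior described in the abstract can be sketched loosely in Python. The update rules below are a simplified illustration of the core idea (a prototype paired with an "asymptotic property vector" that shrinks its effective neighborhood, so the prototype settles on one cluster instead of averaging several), not the paper's exact SSCL equations; the function name `optoc_fit` and all learning-rate choices are our own assumptions.

```python
import numpy as np

def optoc_fit(data, epochs=50, seed=0):
    """Loose sketch of one-prototype-take-one-cluster (OPTOC).

    A prototype P carries an auxiliary vector A initialized far away.
    Patterns farther from P than A is contribute almost nothing to
    P's update, so as A is pulled toward the data P wins one cluster
    rather than drifting to the mean of several. (Simplified
    illustration; not the paper's exact update rules.)
    """
    rng = np.random.default_rng(seed)
    P = data[rng.integers(len(data))].astype(float).copy()   # random init from data
    A = P + rng.normal(scale=data.std() * 5, size=P.shape)   # A starts far from P
    n = 1.0                                                  # count of patterns A has absorbed
    for _ in range(epochs):
        for x in rng.permutation(data):
            d_px = np.linalg.norm(x - P)
            d_pa = np.linalg.norm(A - P)
            # Nearby patterns (relative to |PA|) dominate P's update.
            theta = (d_pa / (d_pa + d_px + 1e-12)) ** 2
            P += 0.5 * theta * (x - P)
            if d_px <= d_pa:
                # x lies inside the dynamic neighborhood: fold it into A
                # with a shrinking 1/n rate, so A drifts toward P's cluster.
                n += 1.0
                A += (x - A) / n
    return P
```

On data with two well-separated clusters, the returned prototype lands near one cluster center rather than near the midpoint of the two; the full SSCL algorithm would then spawn a second prototype via its self-splitting validity measure to capture the remaining cluster, a step this sketch omits.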