U.W. Bangor - School of Informatics - Mathematics Preprints 1998
Pattern Recognition and Fuzzy Systems
98.09 : KUNCHEVA, L.I.
On Combining Multiple Classifiers
Abstract: We consider the classification accuracy of a combination of multiple classifiers. Based on a probabilistic framework, we suggest a multiplicative aggregation connective which we call the _probabilistic product_. A case study with the Satimage data from the ELENA database shows that the probabilistic product aggregation is superior to several aggregation rules of the same type: majority vote, maximum, minimum, simple average, and product.
Published in: Proc. IPMU'98, Paris (1998) 1890-1891.
Download: gzipped postscript: lklPMU98.ps.gz
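The standard aggregation rules the abstract compares against can be sketched as follows. This is a minimal illustration of the common fixed fusion rules (majority vote, minimum, maximum, average, product); the paper's own probabilistic product is its contribution and is not reproduced here.

```python
# Fixed classifier-fusion rules: each classifier in the pool outputs a
# vector of per-class supports; the rules combine these column-wise
# (or by counting votes) and pick the class with the largest result.

def fuse(supports, rule):
    """Combine per-classifier support vectors into a class label.

    supports : list of lists; supports[i][j] is the support that
               classifier i gives to class j.
    rule     : one of 'vote', 'min', 'max', 'avg', 'prod'.
    """
    n_classes = len(supports[0])
    if rule == "vote":
        # Majority vote: each classifier votes for its top-support class.
        votes = [0] * n_classes
        for s in supports:
            votes[s.index(max(s))] += 1
        return votes.index(max(votes))
    combined = []
    for j in range(n_classes):
        col = [s[j] for s in supports]            # supports for class j
        if rule == "min":
            combined.append(min(col))
        elif rule == "max":
            combined.append(max(col))
        elif rule == "avg":
            combined.append(sum(col) / len(col))
        elif rule == "prod":
            p = 1.0
            for v in col:
                p *= v
            combined.append(p)
    return combined.index(max(combined))
```

Note that vote and product can disagree: a class winning two of three first places can still lose on product if its supports are only narrowly ahead.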
98.10 : KUNCHEVA, L.I.
A Comparison of Supervised and Unsupervised GNPC Classifiers
Abstract: The Generalized Nearest Prototype Classifier (GNPC) is a recently proposed framework embracing various classifier paradigms. It extends the nearest prototype classifier by using ``soft'' labeling of the prototypes in the classes. Based on how the prototypes are obtained, we distinguish between _supervised_ and _unsupervised_ GNPC designs. Here we study several GNPC classifiers from each group: the edited nearest neighbor rule (1-nn), the learning vector quantization model (LVQ1), and its version called ``decision surface mapping'' (DSM) as supervised designs; and clustering-and-relabeling, classical vector quantization (VQ), and a fuzzy unsupervised LVQ (named by its authors GLVQ-F) as unsupervised designs. An artificial data set (Sine 2-d) and the Satimage data set from the ELENA database were used. Our results are slightly in favor of the supervised approach in this class of GNPCs. Surprisingly, the unsupervised designs did not appear to be as inferior as has been observed by other authors (e.g., on the IRIS data set).
98.11 : KUNCHEVA, L.I. and BEZDEK, J.C.
Presupervised and Postsupervised Prototype Classifier Design
Abstract: We extend the nearest prototype classifier to a _generalized nearest prototype classifier_ (GNPC). The GNPC uses ``soft'' labeling of the prototypes in the classes, thereby encompassing a variety of classifiers. Based on how the prototypes are found, we distinguish between presupervised and postsupervised GNPC designs. We derive the conditions for Bayes-optimality of two designs where the prototypes represent: (1) the components of the class-conditional mixture densities (presupervised design), or (2) the components of the unconditional mixture density (postsupervised design). An artificial data set and the ``satimage'' data set from the ELENA database are used to study the two approaches experimentally. A Radial Basis Function (RBF) network is used as a representative of each GNPC type. Neither the theoretical nor the experimental results indicate clear reasons to prefer one of the approaches. The postsupervised GNPC design tends to be more robust but less accurate than the presupervised one.
Published in: IEEE Transactions on Neural Networks (to appear).
98.12 : KUNCHEVA, L.I. and LAKOV, D.V.
RBF Networks Versus Fuzzy If-Then Rules for Classification
Abstract: We compare a prototype-based classifier implemented as a radial basis function (RBF) network with a fuzzy if-then rule classifier. Despite the equivalence between the two schemes proven elsewhere, they differ both in philosophy and in performance. We propose the name _type-B_ fuzzy systems for those based on similarity to prototypes, as opposed to _type-A_ fuzzy systems, whose antecedent parts are composed of atomic clauses. Type-B classifiers are preferred because they: (i) do not require decomposition of the membership functions; (ii) do not imply independence of the inputs; (iii) are easy to verify and initialize by the domain expert; (iv) do not grow in complexity with feature dimensionality; and (v) are straightforwardly implementable as an RBF network. The above is supported by experimental results with three data sets (an artificial example, the two-spirals data, and the real data set Heart taken from the database Proben1).
Published in: Int J Knowledge-Based Intelligent Eng Systems, 2 (1998) 203-210.
98.13 : KUNCHEVA, L.I.
A Comparison of Techniques for Combining Multiple Classifiers
Abstract: We use the Satimage and Phoneme data sets from the ELENA database to compare _Fuzzy Templates_ (FT) with nine other techniques for combining multiple classifiers: majority voting; minimum; maximum; product; ``probabilistic'' product; average; Dempster-Shafer aggregation; Behavior-Knowledge Space (BKS); and the ``Naive Bayes'' method. FT and BKS achieved the highest accuracy with both data sets.
98.18 : KUNCHEVA, L.I.
A case study on two groups of GNPC classifiers
Abstract: The Generalized Nearest Prototype Classifier (GNPC) extends the nearest prototype classifier by using ``soft'' labeling of the prototypes in the classes. Based on how the prototypes are obtained, we distinguish between _supervised_ and _unsupervised_ GNPC designs. We study three GNPC classifiers from each group using the IRIS data and the Satimage data from the ELENA database.
Published in: Proc. 6th European Congress on Intelligent Techniques and Soft Computing, Aachen, Germany (1998) 1389-1393.
98.19 : KUNCHEVA, L.I., BEZDEK, J.C. and SUTTON, M.A.
On combining classifiers by fuzzy templates
Abstract: We study classifier fusion by the fuzzy template (FT) technique. Given an object to be classified, each classifier from the pool yields a vector with degrees of ``support'' for the classes, thereby forming a decision profile. A fuzzy template is associated with each class as the decision profile averaged over the training samples from that class. A new object is then labeled with the class whose fuzzy template is closest to the object's decision profile. We give a brief overview of the field to place the FT approach in its proper group of classifier combination techniques. Experiments with two data sets (Satimage and Phoneme) from the ELENA database demonstrate the superior performance of FT over: a version of majority voting; aggregation by fuzzy connectives (minimum, maximum, and product); and the (unweighted) average.
Published in: Proc. NAFIPS'98, Pensacola, Florida (1998) 193-197.
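The fuzzy template procedure described in the abstract above can be sketched directly: average the decision profiles of each class into a template, then label a new object by the nearest template. The choice of squared Euclidean distance between profiles is an illustrative assumption; the paper may use a different proximity measure.

```python
# Fuzzy-template (FT) fusion sketch. A decision profile is the matrix of
# class supports produced by the classifier pool for one object:
# dp[i][j] = support of classifier i for class j.

def train_templates(profiles, labels, n_classes):
    """Average the decision profiles of each class into its fuzzy template."""
    templates = [None] * n_classes
    counts = [0] * n_classes
    for dp, y in zip(profiles, labels):
        if templates[y] is None:
            templates[y] = [[0.0] * len(row) for row in dp]
        for i, row in enumerate(dp):
            for j, v in enumerate(row):
                templates[y][i][j] += v
        counts[y] += 1
    for k in range(n_classes):
        for row in templates[k]:
            for j in range(len(row)):
                row[j] /= counts[k]
    return templates

def classify(dp, templates):
    """Label an object by the class whose template is closest to its profile."""
    def dist(a, b):
        # Squared Euclidean distance between two profile matrices
        # (an assumed proximity measure for this sketch).
        return sum((u - v) ** 2
                   for ra, rb in zip(a, b)
                   for u, v in zip(ra, rb))
    d = [dist(dp, t) for t in templates]
    return d.index(min(d))
```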
98.27 : KUNCHEVA, L.I. and BEZDEK, J.C.
An integrated framework for generalized nearest prototype classifier design
Abstract: We propose a Generalized Nearest Prototype Classifier (GNPC) as a common framework for a number of classification techniques. Specifically, we consider clustering-and-relabeling; Parzen's classifier; radial basis function (RBF) networks; learning vector quantization (LVQ) type classifiers; and nearest neighbor rules.
To classify an unlabeled point x, the GNPC combines the degrees of similarity of x to a set of prototypes. Five questions are addressed for these GNPC families:
(1) How many prototypes do we need?
(2) How are the prototypes found?
(3) How are their class labels obtained?
(4) How are the similarities defined?
(5) How are the similarities and label information combined?
The classification performance of a set of GNPCs is illustrated on two benchmark data sets: IRIS and the 2-spirals data. We study the resubstitution error of the GNPC as a function of the number of prototypes. Our conclusions are that:
(a) unsupervised selection (or extraction) of prototypes followed by relabeling is inferior to techniques that use the class labels to guide the search for prototypes;
(b) the edited nearest neighbor rule is a viable option for GNPC design that has not received the attention it deserves.
Published in: International Journal of Uncertainty, Fuzziness and Knowledge-based Systems 6 (1998) 437-457.
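The GNPC decision rule sketched in the abstract above (combine the similarities of x to the prototypes with the prototypes' soft class labels) admits many instantiations; one common choice, assumed here purely for illustration, is a Gaussian similarity and a similarity-weighted sum of the soft labels:

```python
import math

# GNPC sketch under one assumed instantiation: similarity of x to each
# prototype is Gaussian in the squared distance, and the support for a
# class is the similarity-weighted sum of the prototypes' soft labels.

def gnpc_classify(x, prototypes, soft_labels, sigma=1.0):
    """prototypes  : list of feature vectors v_1, ..., v_p.
    soft_labels : soft_labels[i][k] = degree of membership of
                  prototype i in class k.
    """
    n_classes = len(soft_labels[0])
    support = [0.0] * n_classes
    for v, u in zip(prototypes, soft_labels):
        d2 = sum((a - b) ** 2 for a, b in zip(x, v))
        s = math.exp(-d2 / (2 * sigma ** 2))   # similarity of x to v
        for k in range(n_classes):
            support[k] += u[k] * s              # fold in the soft label
    return support.index(max(support))
```

With crisp (0/1) soft labels and one prototype per class this reduces to the ordinary nearest prototype classifier, which is why the framework subsumes it.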
98.38 : AL-ZAIDAN, A.S.