Multiple Classifier Systems: First International Workshop, MCS 2000 Cagliari, Italy, June 21-23, 2000 Proceedings / Edition 1

by Josef Kittler, Fabio Roli
ISBN-10: 3540677046
ISBN-13: 9783540677048
Pub. Date: 07/26/2000
Publisher: Springer Berlin Heidelberg

Paperback

$54.99

Overview

Many theoretical and experimental studies have shown that a multiple classifier system is an effective technique for reducing prediction errors [9,10,11,20,19]. These studies identify mainly three elements that characterize a set of classifiers:

- The representation of the input (what each individual classifier receives by way of input).
- The architecture of the individual classifiers (algorithms and parametrization).
- The way these classifiers are made to take a decision together.

A combination method can be assumed to be efficient if each individual classifier makes errors 'in a different way', so that most of the classifiers can be expected to correct the mistakes that an individual one makes [1,19]. The term 'weak classifiers' refers to classifiers whose capacity has been reduced in some way so as to increase their prediction diversity: either their internal architecture is simple (e.g., they use mono-layer perceptrons instead of more sophisticated neural networks), or they are prevented from using all the information available. Since each classifier sees different sections of the learning set, the error correlation among them is reduced. It has been shown that the majority vote is the best strategy if the errors among the classifiers are not correlated. Moreover, in real applications, the majority vote also appears to be as efficient as more sophisticated decision rules [2,13]. One method of generating a diverse set of classifiers is to upset some aspect of the training input to which the classifier is unstable. In the present paper, we study two distinct ways to create such weakened classifiers: learning set resampling (using the 'Bagging' approach [5]) and random feature subset selection (using 'MFS', a Multiple Feature Subsets approach [3]). Other recent and similar techniques, likewise based on modifications to the training set and/or the feature set, are not discussed here [7,8,12,21].
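Both weakening strategies and the majority vote are straightforward to sketch. The following Python fragment is a minimal illustration, not code from any of the workshop papers: the synthetic dataset, decision-tree base classifiers, and all parameter choices are assumptions made for demonstration. It trains one ensemble on bootstrap resamples of the learning set (Bagging) and another on random feature subsets (MFS-style), then combines each by a plain majority vote.

```python
# Minimal sketch of Bagging vs. random feature subsets, combined by
# majority vote. Dataset, base classifier, and parameters are
# illustrative assumptions, not taken from the proceedings.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, n_features=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def majority_vote(votes):
    """votes: (n_classifiers, n_samples) array of predicted labels."""
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

n_members, n_feat = 25, X.shape[1]

# Bagging: each member trains on a bootstrap resample of the learning set.
bag_preds = []
for _ in range(n_members):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    clf = DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx])
    bag_preds.append(clf.predict(X_te))

# MFS-style: each member trains on a random half of the feature set.
mfs_preds = []
for _ in range(n_members):
    feats = rng.choice(n_feat, size=n_feat // 2, replace=False)
    clf = DecisionTreeClassifier().fit(X_tr[:, feats], y_tr)
    mfs_preds.append(clf.predict(X_te[:, feats]))

print("bagging + vote:", (majority_vote(np.array(bag_preds)) == y_te).mean())
print("MFS + vote    :", (majority_vote(np.array(mfs_preds)) == y_te).mean())
```

Each member sees a different view of the training data (different examples in the first ensemble, different features in the second), which is exactly what reduces the error correlation the paragraph above appeals to; the vote then exploits that diversity.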

Product Details

ISBN-13: 9783540677048
Publisher: Springer Berlin Heidelberg
Publication date: 07/26/2000
Series: Lecture Notes in Computer Science, #1857
Edition description: 2000
Pages: 408
Product dimensions: 6.10(w) x 9.25(h) x 0.03(d)

Table of Contents

Ensemble Methods in Machine Learning
Experiments with Classifier Combining Rules
The “Test and Select” Approach to Ensemble Combination
A Survey of Sequential Combination of Word Recognizers in Handwritten Phrase Recognition at CEDAR
Multiple Classifier Combination Methodologies for Different Output Levels
A Mathematically Rigorous Foundation for Supervised Learning
Classifier Combinations: Implementations and Theoretical Issues
Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification
Complexity of Classification Problems and Comparative Advantages of Combined Classifiers
Effectiveness of Error Correcting Output Codes in Multiclass Learning Problems
Combining Fisher Linear Discriminants for Dissimilarity Representations
A Learning Method of Feature Selection for Rough Classification
Analysis of a Fusion Method for Combining Marginal Classifiers
A Hybrid Projection Based and Radial Basis Function Architecture
Combining Multiple Classifiers in Probabilistic Neural Networks
Supervised Classifier Combination through Generalized Additive Multi-model
Dynamic Classifier Selection
Boosting in Linear Discriminant Analysis
Different Ways of Weakening Decision Trees and Their Impact on Classification Accuracy of DT Combination
Applying Boosting to Similarity Literals for Time Series Classification
Boosting of Tree-Based Classifiers for Predictive Risk Modeling in GIS
A New Evaluation Method for Expert Combination in Multi-expert System Designing
Diversity between Neural Networks and Decision Trees for Building Multiple Classifier Systems
Self-Organizing Decomposition of Functions
Classifier Instability and Partitioning
A Hierarchical Multiclassifier System for Hyperspectral Data Analysis
Consensus Based Classification of Multisource Remote Sensing Data
Combining Parametric and Nonparametric Classifiers for an Unsupervised Updating of Land-Cover Maps
A Multiple Self-Organizing Map Scheme for Remote Sensing Classification
Use of Lexicon Density in Evaluating Word Recognizers
A Multi-expert System for Dynamic Signature Verification
A Cascaded Multiple Expert System for Verification
Architecture for Classifier Combination Using Entropy Measures
Combining Fingerprint Classifiers
Statistical Sensor Calibration for Fusion of Different Classifiers in a Biometric Person Recognition Framework
A Modular Neuro-Fuzzy Network for Musical Instruments Classification
Classifier Combination for Grammar-Guided Sentence Recognition
Shape Matching and Extraction by an Array of Figure-and-Ground Classifiers