05 Ensemble Learning
◼ ENSEMBLE LEARNING
❑ learning a set of classifiers
◼ http://users.rowan.edu/~polikar/RESEARCH/PUBLICATIONS/csm06.pdf
❑ other names:
◼ multiple classifier systems
◼ committee of classifiers
◼ mixture of experts
◼ data fusion
◼ several data sets from different sources (heterogeneous data)
❑ data from each modality goes to its own classifier, and the outputs are then combined (a sketch follows below)
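A minimal sketch of this fusion scheme, assuming two synthetic feature blocks stand in for heterogeneous modalities; the dataset, the split and the probability-averaging rule are illustrative choices, not from the slides:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# synthetic data; pretend columns 0-3 and 4-7 come from two different sources
X, y = make_classification(n_samples=400, n_features=8, random_state=0)
modalities = [X[:, :4], X[:, 4:]]

idx_train, idx_test = train_test_split(np.arange(len(y)), test_size=0.4,
                                       random_state=0)

# one expert per modality
experts = [LogisticRegression().fit(Xm[idx_train], y[idx_train])
           for Xm in modalities]

# decision-level fusion: average the class probabilities of the experts
proba = np.mean([clf.predict_proba(Xm[idx_test])
                 for clf, Xm in zip(experts, modalities)], axis=0)
print("fused accuracy:", (proba.argmax(axis=1) == y[idx_test]).mean())

Averaging probabilities is only one possible combination rule; majority voting over the per-modality decisions works the same way.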
1. classifier selection
◼ each classifier is an expert for a certain subspace
◼ combination:
❑ the classifier closest (in a given distance metric) to the input vector gets the highest weight
❑ several such local experts may be allowed to vote (a sketch follows after this list)
2. classifier fusion
◼ the whole set of classifiers learns the whole space
◼ combination:
❑ the combination of individual (WEAK) classifiers creates one (STRONG) expert with better performance
❑ e.g. bagging, boosting, ... (a sketch follows after this list)
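A minimal sketch of classifier selection (taxonomy 1), assuming the input space is partitioned by k-means and each region gets its own local expert; the partitioning, the expert type and all parameters are illustrative choices, not from the slides:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4,
                                                    random_state=0)

# partition the input space into subspaces; each gets a local expert
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_train)
experts = {c: DecisionTreeClassifier(max_depth=3).fit(
               X_train[km.labels_ == c], y_train[km.labels_ == c])
           for c in range(4)}

# selection: the expert whose region center is closest to the input decides
regions = km.predict(X_test)
y_pred = np.empty(len(X_test), dtype=int)
for c, clf in experts.items():
    mask = regions == c
    if mask.any():
        y_pred[mask] = clf.predict(X_test[mask])
print("selection accuracy:", (y_pred == y_test).mean())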
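A minimal sketch of classifier fusion (taxonomy 2), using scikit-learn's stock bagging and boosting ensembles over weak decision stumps; the parameter values are illustrative:

from sklearn.datasets import make_moons
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4,
                                                    random_state=0)

# a decision stump is the weak learner; each ensemble fuses 50 of them
ensembles = {
    "single stump": DecisionTreeClassifier(max_depth=1),
    "bagging": BaggingClassifier(DecisionTreeClassifier(max_depth=1),
                                 n_estimators=50, random_state=0),
    "boosting": AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                                   n_estimators=50, random_state=0),
}
for name, model in ensembles.items():
    model.fit(X_train, y_train)
    print(name, "accuracy:", model.score(X_test, y_test))

The boosted ensemble in particular should clearly outperform the single stump here, which is the point of fusing weak learners into a strong one.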
◼ diversity
❑ strategy for ensemble-based systems:
◼ create many classifiers and combine their outputs
◼ the overall performance will be better than that of any single classifier
❑ individual classifiers must make errors on different examples (each classifier should be unique); a disagreement sketch follows below
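One standard way to quantify diversity is pairwise disagreement: the fraction of examples on which two ensemble members predict differently. A minimal sketch, assuming a bagged ensemble of stumps (the measure and all parameters are illustrative choices, not from the slides):

import numpy as np
from itertools import combinations
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
ens = BaggingClassifier(DecisionTreeClassifier(max_depth=1),
                        n_estimators=10, random_state=0).fit(X, y)

# predictions of each ensemble member on the same data
preds = np.array([m.predict(X) for m in ens.estimators_])

# mean pairwise disagreement: 0 = identical members, higher = more diverse
pairs = combinations(range(len(preds)), 2)
disagreement = np.mean([(preds[i] != preds[j]).mean() for i, j in pairs])
print("mean pairwise disagreement:", disagreement)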
◼ weak learner
❑ the base classifier to be learnt, typically one performing only slightly better than chance
◼ 2 taxonomies: classifier selection and classifier fusion (see above)
◼ https://cv-tricks.com/cnn/understand-resnet-alexnet-vgg-inception/
Illustrations
◼ The plots show 2 classes of 100 objects. Banana-shaped class distributions were used to generate the data; 40% of the data was used for training and the rest for testing.
◼ bpxnc classifier (a back-propagation-trained MLP classifier); a comparable Python sketch follows the figure captions below.
[Figure: Banana set, MLP classifier with 3 neurons in the hidden layer; mean, voting and maximum combiners]
[Figure: Banana set, MLP classifiers with 5, 15 and 50 neurons in the hidden layer; mean, voting and maximum combiners]
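A comparable experiment can be sketched in Python, with scikit-learn's make_moons standing in for the banana set and MLPClassifier for bpxnc; the mean, voting and maximum combiners are implemented by hand over the per-network outputs. Everything below is an approximation of the slide setup, not the original code:

import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 2 classes, banana-like shapes; 40% of the data used for training
X, y = make_moons(n_samples=200, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.4,
                                                    random_state=0)

# three MLPs with 5, 15 and 50 hidden neurons, as in the second figure
nets = [MLPClassifier(hidden_layer_sizes=(h,), max_iter=2000,
                      random_state=0).fit(X_train, y_train)
        for h in (5, 15, 50)]
probas = np.stack([n.predict_proba(X_test) for n in nets])  # (3, n, 2)

combiners = {
    "mean":   probas.mean(axis=0).argmax(axis=1),
    "voting": (np.stack([n.predict(X_test) for n in nets]).mean(axis=0)
               > 0.5).astype(int),
    "max":    probas.max(axis=0).argmax(axis=1),
}
for name, y_pred in combiners.items():
    print(name, "combiner accuracy:", (y_pred == y_test).mean())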