I have provided a class that implements adaptive boosting (AdaBoost), a very elegant technique for building a strong classifier out of a set of weak classifiers. The strong classifier is the sign of a weighted combination of the best N weak classifiers, where N is a user-specified parameter. Training consists of N selection rounds. During each round, the training samples carry a weight distribution, and the weak classifier selected is the one with the least weighted error, i.e. the sum of the weights of the samples it misclassifies. Samples that were misclassified are then given larger weights for the next round, which increases the likelihood that a later round picks a weak classifier that helps with the still-misclassified samples.
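To make the round-by-round procedure concrete, here is a minimal sketch in Python of the training loop described above. This is not the provided class, whose interface isn't shown here; `train_adaboost`, `predict`, and the `stumps` pool are hypothetical names, and decision stumps merely stand in for whatever weak-classifier pool is actually used:

```python
# Illustrative AdaBoost sketch (hypothetical names, not the provided class).
import numpy as np

def train_adaboost(X, y, stumps, n_rounds):
    """X: (m, d) samples; y: labels in {-1, +1};
    stumps: list of callables mapping X -> predictions in {-1, +1}."""
    m = len(y)
    w = np.full(m, 1.0 / m)                # uniform weights in round 1
    preds = [h(X) for h in stumps]         # weak-classifier outputs are fixed
    chosen, alphas = [], []
    for _ in range(n_rounds):
        # Select the weak classifier with the least weighted error:
        # the sum of the weights of the samples it misclassifies.
        errors = [np.sum(w[p != y]) for p in preds]
        best = int(np.argmin(errors))
        eps = np.clip(errors[best], 1e-12, 1 - 1e-12)  # guard the log
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        chosen.append(stumps[best])
        alphas.append(alpha)
        # Re-weight: misclassified samples grow, correct ones shrink.
        w *= np.exp(-alpha * y * preds[best])
        w /= w.sum()                       # renormalize the distribution
    return chosen, alphas

def predict(X, chosen, alphas):
    # Strong classifier: sign of the weighted combination of weak outputs.
    scores = sum(a * h(X) for a, h in zip(alphas, chosen))
    return np.sign(scores)
```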
Posted: 11 years ago by Suzanna