1. AdaBoost:
Key points: Construct a strong classifier as a linear combination of weak classifiers.
Update the weight of each instance based on the prediction of the selected weak classifier; instances the classifier gets wrong are up-weighted. The weight alpha_n of the classifier itself is determined by its weighted prediction error eps_n: setting the partial derivative of the exponential loss to zero gives the optimal value alpha_n = (1/2) * ln((1 - eps_n) / eps_n).
Prove:
The weighted error of h_n in step (n+1) (i.e., after re-weighting) is exactly 1/2, because the update re-normalizes the weights so that correctly and incorrectly classified instances each carry half the total mass. A classifier with error 1/2 is no better than chance, so the classifier just selected will not be chosen again in the immediately following step (before convergence).
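The key points above can be sketched as follows (the notes implement this in MATLAB; here is a minimal Python sketch with brute-force decision stumps as the weak classifiers — function names and the stump search are illustrative, not from the notes):

```python
import numpy as np

def adaboost_train(X, y, T):
    """AdaBoost sketch with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                # instance weights, initially uniform
    learners = []                          # (feature, threshold, polarity, alpha)
    for _ in range(T):
        best, best_err = None, np.inf
        # select the stump with the smallest weighted error
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.sign(X[:, j] - thr + 1e-12)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        eps = max(best_err, 1e-12)         # clamp to avoid log(1/0)
        alpha = 0.5 * np.log((1 - eps) / eps)  # optimal weight from the partial derivative
        j, thr, pol = best
        pred = pol * np.sign(X[:, j] - thr + 1e-12)
        w *= np.exp(-alpha * y * pred)     # up-weight mistakes, down-weight correct ones
        w /= w.sum()                       # for 0 < eps < 1/2, this stump's weighted
        learners.append((j, thr, pol, alpha))  # error on the new weights is exactly 1/2
    return learners

def adaboost_predict(learners, X):
    """Strong classifier: sign of the weighted vote of all stumps."""
    score = np.zeros(X.shape[0])
    for j, thr, pol, alpha in learners:
        score += alpha * pol * np.sign(X[:, j] - thr + 1e-12)
    return np.sign(score)
```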
2. RealBoost
An improved variant of AdaBoost whose weak classifiers output real values (confidence-rated predictions) instead of binary labels.
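One common form of real-valued weak classifier (as in Schapire and Singer's Real AdaBoost) bins a feature's response and outputs h(x) = (1/2) * ln(p_b / q_b), where p_b and q_b are the weighted fractions of positives and negatives in the bin containing x. A minimal Python sketch under that assumption (the binning scheme and names are illustrative, not from the notes):

```python
import numpy as np

def realboost_train(X, y, T, n_bins=8):
    """Real AdaBoost sketch: each weak learner bins one feature and outputs
    h(x) = 0.5 * ln(p_b / q_b) for the bin b containing x; y in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    learners = []
    for _ in range(T):
        best, best_z = None, np.inf
        for j in range(d):
            edges = np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1))
            bins = np.clip(np.searchsorted(edges, X[:, j], side='right') - 1,
                           0, n_bins - 1)
            p = np.array([w[(bins == b) & (y == 1)].sum() for b in range(n_bins)])
            q = np.array([w[(bins == b) & (y == -1)].sum() for b in range(n_bins)])
            z = 2 * np.sum(np.sqrt(p * q))    # normalizer Z; smaller is better
            if z < best_z:
                h = 0.5 * np.log((p + 1e-9) / (q + 1e-9))  # smoothed log-odds
                best_z, best = z, (j, edges, h)
        j, edges, h = best
        bins = np.clip(np.searchsorted(edges, X[:, j], side='right') - 1,
                       0, n_bins - 1)
        w *= np.exp(-y * h[bins])             # re-weight by the real-valued output
        w /= w.sum()
        learners.append((j, edges, h))
    return learners

def realboost_score(learners, X):
    """Strong classifier score: sum of the real-valued weak outputs."""
    score = np.zeros(X.shape[0])
    for j, edges, h in learners:
        bins = np.clip(np.searchsorted(edges, X[:, j], side='right') - 1,
                       0, h.size - 1)
        score += h[bins]
    return score
```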
3. Implementation in MATLAB
Notes:
Plot the prediction error over boosting iterations to check that the learning process makes sense (the training error should generally decrease).
In RealBoost, remember to re-order the features selected by AdaBoost into their order of selection before reusing them.
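The error-curve diagnostic mentioned above can be computed as follows (shown in Python for illustration, although the notes use MATLAB; the layout of the per-round prediction array is an assumption):

```python
import numpy as np

def training_error_curve(round_outputs, y):
    """Training error of the strong classifier after each boosting round.

    round_outputs: T x n array whose row t holds alpha_t * h_t(x_i)
    (or the real-valued h_t(x_i) in RealBoost) for every instance i.
    Plotting the returned curve should show a roughly decreasing trend
    if the learning process is working.
    """
    scores = np.cumsum(round_outputs, axis=0)  # partial strong-classifier scores
    preds = np.sign(scores)                    # prediction after each round
    return np.mean(preds != y, axis=1)         # error rate per round
```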