Nowadays most people use XGBoost, LightGBM, or CatBoost to win competitions on Kaggle and at hackathons. AdaBoost is the first step into the world of boosting.

AdaBoost:

  • AdaBoost, short for Adaptive Boosting, was formulated by Yoav Freund and Robert Schapire, who won the 2003 Gödel Prize for their work.
  • AdaBoost can be used for both classification and regression problems.
  • It combines multiple “weak classifiers” into a single “strong classifier”.
  • It works by putting more weight on instances that are difficult to classify and less on those already handled well; the final model is a weighted vote of the weak classifiers, written out just below this list.
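
In standard binary AdaBoost, that weighted vote takes the form

\( H(x) = \operatorname{sign}\!\left( \sum_{t=1}^{T} \alpha_t \, h_t(x) \right) \)

where each weak classifier \( h_t(x) \in \{-1, +1\} \) and \( \alpha_t \) is its “amount of say” in the vote; more accurate weak learners get a larger \( \alpha_t \).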

How does AdaBoost work?

Boosting is a sequential ensemble technique: it combines a set of weak learners to deliver improved prediction accuracy. After each round, the instances predicted correctly are given lower weights and the misclassified ones are weighted higher.
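
Concretely, in round \( t \) standard binary AdaBoost (with labels \( y_i \in \{-1, +1\} \) and sample weights \( w_i \) that sum to 1) computes

\( \epsilon_t = \sum_{i:\, h_t(x_i) \ne y_i} w_i, \qquad \alpha_t = \frac{1}{2} \ln \frac{1 - \epsilon_t}{\epsilon_t}, \qquad w_i \leftarrow \frac{w_i \, e^{-\alpha_t \, y_i \, h_t(x_i)}}{Z_t}, \)

where \( Z_t \) renormalizes the weights so they sum to 1. A misclassified point has \( y_i h_t(x_i) = -1 \), so its weight is multiplied by \( e^{\alpha_t} > 1 \); a correctly classified point is multiplied by \( e^{-\alpha_t} < 1 \).
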
Let’s understand AdaBoost with a simple illustration.

[Figure: Boosting illustration, Box-1 through Box-4]

Box-1: The first weak learner misclassifies two (+) points.

Box-2: This classifier gives more weight to the two misclassified (+) points and reduces the error of the previous classifier, but misclassifies two (-) points.

Box-3: This classifier gives more weight to the two misclassified (-) points and reduces the error of the previous classifier, but misclassifies two (+) points and one (-) point.

Box-4: The final classifier combines the weak learners from Box-1, Box-2, and Box-3 and does a good job of classifying all the (+) and (-) points.
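
To put rough numbers on the jump from Box-1 to Box-2, here is a small hypothetical walk-through in Python. The sample size of 10 points and the uniform starting weights are assumptions for illustration; only the “two misclassified points” comes from the figure above.

```python
import numpy as np

# Hypothetical Box-1 setup: 10 points, uniform weights, 2 of them misclassified.
w = np.full(10, 0.1)                 # initial sample weights
misclassified = np.zeros(10, bool)
misclassified[:2] = True             # the two (+) points the first stump gets wrong

eps = w[misclassified].sum()                 # weighted error: 0.2
alpha = 0.5 * np.log((1 - eps) / eps)        # amount of say: ~0.693

# Scale misclassified points up, correct ones down, then renormalize.
w = w * np.exp(np.where(misclassified, alpha, -alpha))
w = w / w.sum()

print(f"error={eps:.2f}, alpha={alpha:.3f}")
print("weight of a misclassified point:", round(w[0], 4))   # 0.25
print("weight of a correct point:      ", round(w[-1], 4))  # 0.0625
```

After one round, each of the two points the first stump got wrong carries four times the weight of a correctly classified point, which is why the classifier in Box-2 concentrates on them.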

Steps and Terminology in AdaBoost:

  • In AdaBoost the trees usually have one node and two leaves; such a tree is called a stump, and AdaBoost builds a forest of stumps.
  • On their own, stumps are usually not good at making classifications.
  • The errors made by the first stump influence how the second stump is formed: the second stump reduces the error made by the first stump, the third stump reduces the error made by the second, and so on.
  • As we have seen, the classifier gives more weight to misclassified points so that the next stump corrects them.
  • The weak learners (stumps) combine into a strong classifier that predicts very accurately; a minimal from-scratch sketch of this loop follows the list.
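
Here is that stump-by-stump loop as a minimal from-scratch sketch, assuming binary labels encoded as -1/+1 and using scikit-learn's DecisionTreeClassifier with max_depth=1 as the stump. It is an illustration of the idea, not a production implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Train a 'forest of stumps' on labels y in {-1, +1}; returns (stumps, alphas)."""
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                      # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)         # stump sees the current weights
        pred = stump.predict(X)
        eps = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - eps) / eps)    # amount of say for this stump
        w = w * np.exp(-alpha * y * pred)        # up-weight mistakes, down-weight hits
        w = w / w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted vote of all stumps: sign of the alpha-weighted sum of predictions."""
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```

Because each stump is fitted with the current sample weights, it is pulled toward the points its predecessors got wrong; the final prediction is the \( \alpha \)-weighted vote from the formula shown earlier.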

Implementation:

[Embedded code: AdaBoost Implementation]

Please find the full implementation here.
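
In practice you rarely write the loop yourself; a minimal sketch with scikit-learn's AdaBoostClassifier looks like this (the dataset and hyper-parameters below are illustrative choices, not necessarily those used in the full implementation):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative dataset; the linked implementation may use a different one.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# The default base estimator is a depth-1 decision tree, i.e. a stump.
model = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```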

