Difference between Boosting and Bagging

Ensemble learning needs to be understood before the difference between Bagging and Boosting can be discussed. Sometimes it is not sufficient to depend upon the results of just one machine learning model. Ensemble models combine the predictive power of multiple weak learners. The result is a single model which gives the aggregated Read more…
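
As a hedged illustration of the contrast (this example is not from the original post; the dataset, estimator counts, and random seeds are assumptions), the sketch below fits a bagging ensemble and a boosting ensemble with scikit-learn:

```python
# Minimal sketch: bagging vs. boosting with scikit-learn (illustrative only).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Bagging: trains decision trees independently on bootstrap samples and averages their votes.
bagging = BaggingClassifier(n_estimators=100, random_state=0)

# Boosting: trains shallow trees sequentially, each one focusing on the errors of the previous ones.
boosting = AdaBoostClassifier(n_estimators=100, random_state=0)

print("Bagging accuracy:", cross_val_score(bagging, X, y, cv=5).mean())
print("Boosting accuracy:", cross_val_score(boosting, X, y, cv=5).mean())
```

Both ensembles use decision trees as weak learners by default; the key difference is that bagging reduces variance by training trees in parallel, while boosting reduces bias by training them sequentially.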

Random Forest

Random Forest: Random Forest is a supervised learning algorithm; it is an ensemble classifier built from many decision tree models. Ensemble models combine the results of different models. Combining learning models to improve the overall result is called the bagging method. In simple words: Random Forest builds multiple decision trees and merges Read more…
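
A minimal sketch of the idea (not the article's own code; the dataset, split, and hyperparameters are assumptions chosen for illustration):

```python
# Minimal sketch: a random forest is a bagging ensemble of decision trees.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Each tree is fit on a bootstrap sample, with a random subset of features considered at each split.
forest = RandomForestClassifier(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)

# Predictions are aggregated (majority vote) across all trees.
print("Test accuracy:", forest.score(X_test, y_test))
```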

Decision Tree

Decision Tree is a graphical representation that identifies ways to split a data set based on different conditions. It is one of the most widely used and practical methods for supervised learning, applied to both classification and regression tasks. A Decision Tree is a tree-like structure or model of decisions with Read more…
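
As a small assumed example (not taken from the post), the sketch below fits a shallow tree and prints the split conditions it learned:

```python
# Minimal sketch: a decision tree splits the data on feature thresholds.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(iris.data, iris.target)

# Print the learned tree-like structure of decisions (one condition per internal node).
print(export_text(tree, feature_names=list(iris.feature_names)))
```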

Entropy

Entropy: It measures the randomness in the data. It helps to find the root node, intermediate nodes, and leaf nodes when building the decision tree. It is simply a metric that measures impurity. It reaches its minimum (zero) when all cases in the node fall into a single target Read more…
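
For reference, the usual definition (standard textbook form, not quoted from the article) for a node \(S\) with \(c\) classes is

\( H(S) = -\sum_{i=1}^{c} p_i \log_2 p_i \)

where \(p_i\) is the proportion of cases in \(S\) belonging to class \(i\). For example, a node split evenly between two classes gives \(H = -(0.5\log_2 0.5 + 0.5\log_2 0.5) = 1\), while a pure node gives \(H = 0\).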

Gini Impurity

Gini Impurity: It is a measure of how often a randomly chosen element from the set would be incorrectly labelled. It helps to find the root node, intermediate nodes, and leaf nodes when building the decision tree. It is used by the CART (classification and regression tree) algorithm for classification Read more…
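
For reference, the standard form (again a textbook definition, not quoted from the article) is

\( G(S) = 1 - \sum_{i=1}^{c} p_i^2 \)

For a node split evenly between two classes, \(G = 1 - (0.25 + 0.25) = 0.5\); a pure node gives \(G = 0\), which is why CART prefers splits that drive Gini impurity toward zero.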
