Boosting improves model performance by iteratively correcting the mistakes of weak learners, whereas bagging reduces variance by training multiple models in parallel.
Step 1: Understanding Ensemble Methods
Ensemble methods are machine learning techniques that combine the predictions of several models so that the ensemble performs better than any single constituent model.
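To make the idea concrete, here is a minimal sketch of the simplest way to combine models: a majority vote over their predictions. The per-model outputs are hard-coded purely for illustration.

```python
# A minimal sketch of the ensemble idea: combine several models' predictions
# for one sample by majority vote. The predictions below are hypothetical.
from collections import Counter

def majority_vote(predictions):
    """Return the most common prediction among the individual models."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical models vote on the same sample; two of three say class 1.
print(majority_vote([1, 0, 1]))  # -> 1: the ensemble outvotes the one mistake
```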
Step 2: Analyzing the Given Options
- Boosting (Correct Answer): Boosting trains weak learners sequentially, with each new learner reweighting or refitting the training data to correct the errors of its predecessors (see the sketch after this list). Examples include AdaBoost and Gradient Boosting.
- Bagging: Bagging (Bootstrap Aggregating) is also an ensemble method, but it trains multiple models independently and in parallel on bootstrap samples of the training data; Random Forest is a well-known example.
- Pruning: Pruning is not an ensemble method; it is used to simplify decision trees by removing less important branches.
- Regret Learning: Regret-based learning comes from online learning theory and is not classified as an ensemble method.
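As referenced above, the following is a hedged sketch contrasting the two ensemble styles using scikit-learn's AdaBoostClassifier and BaggingClassifier on synthetic data. The dataset parameters and random seeds are illustrative, and exact accuracy scores will vary.

```python
# A sketch comparing boosting and bagging with scikit-learn
# (assumes scikit-learn is installed; both use decision trees by default).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data; sizes chosen arbitrarily for the demo.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Boosting: learners are trained sequentially, each correcting its predecessors.
boost = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

# Bagging: learners are trained independently on bootstrap samples of the data.
bag = BaggingClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

print("AdaBoost accuracy:", boost.score(X_test, y_test))
print("Bagging accuracy:", bag.score(X_test, y_test))
```

The key design difference shows up in training: boosting is inherently sequential because each learner depends on the previous one's errors, while bagging's independent learners can be fit in parallel.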